
Methods And Systems For Histopathology Stain Normalization

Abstract: Embodiments herein disclose methods and systems for normalizing histopathological images, wherein the histopathological images can be normalized using a stain transfer generator network, a discriminator and a pre-trained Convolutional Neural Network (CNN). (FIG. 2)


Patent Information

Application #
202021031736
Filing Date
24 July 2020
Publication Number
04/2022
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application

Applicants

AIRAMATRIX PRIVATE LIMITED
801, Dosti Pinnacle, Road No.22, Wagle Industrial Estate, MIDC, Thane (West), Maharashtra, India, 400604

Inventors

1. Harshal Chandrakant Nishar
A - 303, Prajakta CHS Ltd., 51st Road, T.P.S.-3, Borivali (West), Mumbai - 400092, Maharashtra, India.
2. Anindya Hajra
95/2/2, Abinash Banerjee Lane, Shibpur, Howrah, Kolkata, India, Pin code-711102

Specification

CROSS REFERENCE TO RELATED APPLICATION
This application is based on and derives the benefit of Indian Provisional Application IN202021031736, the contents of which are incorporated herein by reference.
TECHNICAL FIELD
[001] Embodiments disclosed herein relate to processing of medical images and more particularly to stain normalization of histopathological images.
BACKGROUND
[002] Haematoxylin and Eosin (H&E) stained Whole Slide Images (WSI) from different sources (such as laboratories, research institutes, medical institutions, and so on) look very different. FIGs. 1A, 1B and 1C depict example stained images of the kidney, lungs and liver from different sources respectively.
[003] Because slides of even similar cells vary across sources, training data with limited stain variability introduces an inherent data bias. Deep Learning (DL) models trained on images from a specific set of sources are typically unable to generalize well to stained slides from new sources.
OBJECTS
[004] The principal object of embodiments herein is to disclose methods and systems for normalizing histopathological images, wherein the histopathological images can be normalized using a stain transfer generator network, a discriminator and a pre-trained Convolutional Neural Network (CNN).
[005] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating at least one embodiment and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF FIGURES
[006] Embodiments herein are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[007] FIGs. 1A, 1B and 1C depict example stained images of the kidney, lungs and liver from different labs respectively;
[008] FIG. 2 depicts the architecture of a system for normalizing histopathological images, according to embodiments as disclosed herein;
[009] FIG. 3 depicts examples of the HRNet, according to embodiments as disclosed herein;
[0010] FIG. 4 depicts the HRNet normalizing histopathological images, according to embodiments as disclosed herein;
[0011] FIG. 5 is a flowchart depicting the process of normalizing histopathological images, according to embodiments as disclosed herein;
[0012] FIG. 6 is a flowchart depicting a process of determining if style transfer is to be performed, according to embodiments as disclosed herein;
[0013] FIG. 7 is a flowchart depicting the process of style transfer, according to embodiments as disclosed herein;
[0014] FIGs. 8A-8E depict example kidney glomeruli segmentation models trained on a set of images, according to embodiments as disclosed herein;
[0015] FIG. 9 discloses the dice score for the segmentation model with and without stain normalization, according to embodiments as disclosed herein;
[0016] FIG. 10A depicts an example comparison with existing methods on a publicly available mitosis dataset, according to embodiments as disclosed herein; and
[0017] FIG. 10B depicts a comparison with perceptual similarity measure, according to embodiments as disclosed herein.
DETAILED DESCRIPTION
[0018] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0019] The embodiments herein achieve methods and systems for normalizing histopathological images. Referring now to the drawings, and more particularly to FIGS. 2 through 10B, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0020] Embodiments herein determine a transformation such that a stain style similarity measure, which defines the similarity between two sets of stains, is minimized, and a content similarity measure between an input image and a transformed image is minimized for all the input stain images.
[0021] Consider a set of reference stain images $R$, a set of input stain images $P$ and a set of transformed images $T$. A transformation $G: P \to T$ has to be determined such that $S(T, R)$ is minimum and $C(G(p), p)$ is minimum $\forall p \in P$. $S(\cdot)$ is the stain style similarity measure which defines the similarity between two sets of stains. $C(\cdot)$ is the content similarity measure between the input image and the transformed image.
[0022] FIG. 2 depicts the architecture of a system for normalizing histopathological images. The system 200 comprises a modified High Resolution network (HRNet) 201, a pre-trained Convolutional Neural Network (CNN) 202, a discriminator 203, and a loss determination module 204.
[0023] At least one input image is provided to the HRNet 201; the at least one input image can be a single image or a set of images. The HRNet 201 transforms the at least one input image. FIG. 3 depicts examples of the HRNet. The HRNet 201 can directly transfer high-resolution features from layer to layer without downsampling, thereby preserving the high-resolution features and providing a better photo-realistic style transfer. The HRNet 201 comprises a direct skip connection from its input to its output, due to which it learns a residual transformation; this provides faster training with better convergence. The HRNet 201 provides the transformed image to the pre-trained CNN 202 (for example, VGG-19) and the discriminator 203.
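By way of an illustrative sketch only (not part of the original disclosure), the residual behaviour described above can be approximated as follows, assuming PyTorch; the three-layer body and channel width are assumptions standing in for the modified HRNet 201:

```python
# Minimal sketch of a stain-transfer generator with a direct input-to-output
# skip connection, so the network only learns a residual colour transformation.
# Illustrative stand-in for the modified HRNet; layer sizes are assumptions.
import torch
import torch.nn as nn

class ResidualStainGenerator(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Full-resolution convolutional body: no downsampling, so
        # high-resolution features are carried from layer to layer.
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 3, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Direct skip connection from input to output: the body learns only
        # the residual needed to move the stain towards the target style.
        return torch.clamp(x + self.body(x), 0.0, 1.0)
```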
[0024] The CNN 202 is a VGG-16/19 network pre-trained on the ImageNet dataset. The CNN 202 acts as a feature extractor for style and content features. Using the transformed image, the at least one input image and a reference image, the CNN 202 determines the content loss and the style loss.
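For illustration, a frozen VGG-19 feature extractor of the kind described can be built with torchvision as sketched below; the chosen layer indices are assumptions, not specified by the disclosure:

```python
# Sketch of a frozen, ImageNet-pre-trained VGG-19 feature extractor that
# returns intermediate activations for use as style/content features.
import torch
import torchvision.models as models

class VGGFeatures(torch.nn.Module):
    def __init__(self, layer_ids=(3, 8, 17, 26)):  # illustrative layer picks
        super().__init__()
        vgg = models.vgg19(weights=models.VGG19_Weights.IMAGENET1K_V1).features.eval()
        for p in vgg.parameters():
            p.requires_grad_(False)  # frozen feature extractor
        self.vgg = vgg
        self.layer_ids = set(layer_ids)

    def forward(self, x: torch.Tensor):
        feats = []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i in self.layer_ids:
                # Assume batch size 1 so later sketches work on (C, H, W).
                feats.append(x.squeeze(0))
        return feats
```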
[0025] Let $p$ be the input image, $r$ be the reference image and $t$ be the transformed image. $F_{ij}^l$ denotes the activation of the $i^{th}$ filter of the $l^{th}$ layer of the CNN 202 at the $j^{th}$ location.
[0026] The CNN 202 can compute the style loss from the style representations obtained by passing the reference image and the transformed image through a pre-trained CNN (for example, VGG16 or VGG19). Style representations are the Gram matrices of the CNN features at each layer. Each element of a Gram matrix is given by the inner product of the corresponding vectorized feature maps. The CNN 202 can determine the style loss as
$$G_{ik}^l = \sum_j F_{ij}^l F_{kj}^l \qquad (1)$$
$$e^l(r, t) = \frac{1}{N_l M_l} \sum_{i,k} \left( G_{ik}^l(r) - G_{ik}^l(t) \right)^2 \qquad (2)$$
$$L_{style}(r, t) = \sum_l w_l \, e^l(r, t) \qquad (3)$$
where $N_l$ is the number of feature maps at layer $l$, $M_l$ is the size of each feature map, and $w_l$ are per-layer weights.
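As an illustrative sketch (not part of the original disclosure), equations (1) to (3) can be transcribed as follows, assuming feature tensors of shape (C, H, W) as returned by the extractor sketched earlier, with the layer weights $w_l$ left as free parameters:

```python
# Equations (1)-(3): Gram matrices of CNN feature maps and the style loss.
import torch

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    # Vectorize each feature map and take inner products, as in equation (1).
    n, h, w = feat.shape
    f = feat.reshape(n, h * w)
    return f @ f.t()

def style_loss(ref_feats, gen_feats, layer_weights):
    # Equations (2) and (3): per-layer Gram-matrix MSE, scaled by 1/(N_l M_l),
    # then weighted and summed over layers.
    loss = 0.0
    for fr, ft, wl in zip(ref_feats, gen_feats, layer_weights):
        n, h, w = fr.shape
        g_r, g_t = gram_matrix(fr), gram_matrix(ft)
        loss = loss + wl * ((g_r - g_t) ** 2).sum() / (n * (h * w))
    return loss
```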
[0027] The CNN 202 can compute the content loss from the content features obtained by passing the input image and the transformed image through a pre-trained CNN (for example, VGG16 or VGG19). The CNN 202 can determine the content loss as
$$L_{content}^l(p, t) = \frac{1}{2} \sum_{i,j} \left( F_{ij}^l(p) - F_{ij}^l(t) \right)^2 \qquad (4)$$
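Equation (4) in the same sketch form, again assuming per-layer feature tensors from the extractor above:

```python
# Equation (4): content loss between input and transformed images at a layer.
def content_loss(in_feat: torch.Tensor, gen_feat: torch.Tensor) -> torch.Tensor:
    return 0.5 * ((in_feat - gen_feat) ** 2).sum()
```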
[0028] The discriminator 203 can distinguish between the original reference stain images and fake images generated using a stain transfer generator network. The discriminator 203 can minimize an adversarial loss. The discriminator 203, using the transformed image and a reference image (which can be images of one or more desired stains), can determine the adversarial loss. The discriminator 203 can iteratively minimize the total stain transfer generator loss (which is determined using a discriminator loss and the adversarial loss).
[0029] The discriminator loss can be used to distinguish between the original reference stain images and fake images generated using the stain transfer generator network. The discriminator loss is calculated as a log loss between the reference image and the transformed image. The discriminator 203 can determine the discriminator loss (L_dis) as
$$L_{dis} = \log(1 - D(r)) + \log(D(t)) \qquad (5)$$
[0030] Images have to be generated that are similar to the reference stain images, so that the discriminator cannot distinguish them from real samples drawn from the reference stain images. For this purpose, the adversarial loss has to be minimized, wherein the adversarial loss is defined as the log of one minus the discriminator output on the transformed image. The discriminator 203 can determine the adversarial loss (L_adv) as
$$L_{adv} = \log(1 - D(t)) \qquad (6)$$
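Equations (5) and (6) can be transcribed as in the sketch below, reading $D(\cdot)$ as the discriminator's estimated probability that its input is a real reference-stain image, under which both quantities are suitable for direct minimization as the text states; the small epsilon guarding the logarithms is an added safeguard, not part of the disclosure:

```python
# Equations (5) and (6), with D(.) = probability the input is a real
# reference-stain image.
import torch

EPS = 1e-8  # numerical guard for the logarithms (an assumption)

def discriminator_loss(d_ref: torch.Tensor, d_gen: torch.Tensor) -> torch.Tensor:
    # L_dis = log(1 - D(r)) + log(D(t)); minimized by the discriminator,
    # driving D(r) -> 1 (real) and D(t) -> 0 (fake).
    return (torch.log(1.0 - d_ref + EPS) + torch.log(d_gen + EPS)).mean()

def adversarial_loss(d_gen: torch.Tensor) -> torch.Tensor:
    # L_adv = log(1 - D(t)); minimized by the generator, driving D(t) -> 1
    # so generated images pass as reference-stain images.
    return torch.log(1.0 - d_gen + EPS).mean()
```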
[0031] The total stain transfer generator loss (L_gen) can be determined by the loss determination module 204 using the adversarial loss (as determined by the discriminator 203), the content loss (as determined by the CNN 202) and the style loss (as determined by the CNN 202) as
$$L_{gen} = \lambda_a L_{adv} + \lambda_c L_{content} + \lambda_s L_{style} \qquad (7)$$
where $\lambda_a$, $\lambda_c$ and $\lambda_s$ are weighting factors.
[0032] The loss determination module 204 can provide the determined total stain transfer generator loss to the HRNet 201. As depicted in FIG. 4, the HRNet 201 can use the total stain transfer generator loss to normalize histopathological images to a desired stain, wherein the histopathological images are provided as input to the HRNet 201.
[0033] The HRNet 201 can determine if style transfer has to be performed for a new study. The HRNet 201 can determine that the style transfer has to be performed for a new study, if the stain for the new study is away from a reference stain by more than a pre-defined distance.
[0034] The HRNet 201 can prepare a set of reference tiles for each organ at a plurality of magnifications. In an embodiment herein, the tiles can cover the entire organ, including the areas of the organ with abnormal parameters. In an embodiment herein, the tiles can cover pre-defined portions of the organ, including the areas of the organ with both normal and abnormal parameters. For example, the HRNet 201 can prepare the tiles for each organ at magnifications of 1:1, 1:2, 1:4, and so on.
[0035] The HRNet 201 can perform style transfer for a stain. The HRNet 201 can select a set of tiles from the organ (which is the subject of the current study). The HRNet 201 can pair the reference tiles and the selected set of tiles, based on common/similar features/parameters. The HRNet 201 can train the Style Transfer model for the study using the pairs.
[0036] FIG. 5 is a flowchart depicting the process for normalizing histopathological images. In step 501, at least one input image is provided to the HRNet 201. In step 502, the HRNet 201 transforms the at least one input image; the transformed image preserves the high-resolution features. The HRNet 201 provides a better photo-realistic style transfer by directly transferring the high-resolution features from layer to layer without downsampling. The HRNet 201 comprises a direct skip connection from its input to its output, due to which it learns a residual transformation. In step 503, the HRNet 201 provides the transformed image to the pre-trained CNN (VGG-19) 202 and the discriminator 203.
[0037] In step 504, the CNN 202 determines the content loss and the style loss using the transformed image, the at least one input image and a reference image. In step 505, the discriminator 203 determines the adversarial loss and the discriminator loss. In step 506, the loss determination module 204 determines the total stain transfer generator loss using the adversarial loss, the content loss, the discriminator loss and the style loss. In step 507, the loss determination module 204 provides the determined total stain transfer generator loss to the HRNet 201. In step 508, the HRNet 201 uses the total stain transfer generator loss to normalize histopathological images to a desired stain. The various actions in method 500 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 5 may be omitted.
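Tying steps 501 through 508 together, a single training iteration might look like the sketch below; it assumes the generator, feature extractor and loss functions sketched above, batch size 1, and hypothetical weighting factors for equation (7):

```python
# One training step following steps 501-508 (illustrative sketch only).
def train_step(gen, D, features, p, r, opt_g, opt_d,
               lambda_a=1.0, lambda_c=1.0, lambda_s=1.0,
               layer_weights=(1.0, 1.0, 1.0, 1.0)):
    t = gen(p)                                   # step 502: transform input

    # Step 505: update the discriminator on reference vs. generated images,
    # minimizing equation (5); detach t so generator gradients do not flow.
    opt_d.zero_grad()
    l_dis = discriminator_loss(D(r), D(t.detach()))
    l_dis.backward()
    opt_d.step()

    # Steps 504-506: content, style and adversarial losses, combined per (7).
    opt_g.zero_grad()
    f_p, f_r, f_t = features(p), features(r), features(t)
    l_gen = (lambda_a * adversarial_loss(D(t))
             + lambda_c * content_loss(f_p[-1], f_t[-1])
             + lambda_s * style_loss(f_r, f_t, layer_weights))
    l_gen.backward()                             # step 507: feed loss back
    opt_g.step()                                 # step 508, over many iterations
    return l_gen.item(), l_dis.item()
```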
[0038] FIG. 6 is a flowchart depicting a process of determining if style transfer is to be performed. Consider that a study of a new stain has to be performed. In step 601, the stain for the new study is compared with a reference stain. The stain for the new study may be compared with the reference stain distribution in terms of a stain comparison metric; an example of such a metric is the Mahalanobis distance. In step 602, it is checked whether the stain for the new study is away from the reference stain by more than the pre-defined distance. In an example herein, the pre-defined distance (i.e., the threshold) can be twice the standard deviation above the mean of the Mahalanobis distances defined on the reference stain distribution. If the stain for the new study is away from the reference stain by more than the pre-defined distance, in step 603, style transfer is performed for the new stain. The various actions in method 600 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 6 may be omitted.
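The check in steps 601 through 603 can be sketched as below, assuming each study is summarised by a stain-statistics vector (for example, per-channel colour means); that vectorisation is an assumption, while the mean-plus-two-standard-deviations threshold follows the example above:

```python
# Decide whether style transfer is needed for a new study (FIG. 6 sketch).
import numpy as np

def needs_style_transfer(new_stain_vec: np.ndarray,
                         ref_stain_vecs: np.ndarray) -> bool:
    # ref_stain_vecs: (n_samples, n_features) stain statistics of the
    # reference distribution; new_stain_vec: (n_features,) for the new study.
    mu = ref_stain_vecs.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(ref_stain_vecs, rowvar=False))

    def mahalanobis(v: np.ndarray) -> float:
        d = v - mu
        return float(np.sqrt(d @ cov_inv @ d))

    # Threshold: mean + 2 std of Mahalanobis distances on the reference set.
    ref_dists = np.array([mahalanobis(v) for v in ref_stain_vecs])
    threshold = ref_dists.mean() + 2.0 * ref_dists.std()
    return mahalanobis(new_stain_vec) > threshold
```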
[0039] FIG. 7 is a flowchart depicting the process of style transfer. The HRNet 201 can perform style transfer for a stain. In step 701, the HRNet 201 selects a set of tiles from the organ (which is the subject of the current study). In step 702, the HRNet 201 can pair the reference tiles and the selected set of tiles, based on common/similar features/parameters. In step 703, the HRNet 201 can train the Style Transfer model for the study using the pairs. The various actions in method 700 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 7 may be omitted.
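Steps 701 and 702 can be sketched as a nearest-neighbour pairing in a feature space; `tile_features` is a hypothetical descriptor function (for example, pooled CNN features or colour statistics), not named in the disclosure:

```python
# Pair study tiles with reference tiles by feature similarity (FIG. 7 sketch).
import numpy as np

def pair_tiles(study_tiles, ref_tiles, tile_features):
    # tile_features: hypothetical callable mapping a tile to a 1-D descriptor.
    ref_feats = np.stack([tile_features(t) for t in ref_tiles])
    pairs = []
    for tile in study_tiles:
        f = tile_features(tile)
        # Nearest reference tile under Euclidean distance in feature space.
        idx = int(np.argmin(np.linalg.norm(ref_feats - f, axis=1)))
        pairs.append((tile, ref_tiles[idx]))
    return pairs  # training pairs for the style transfer model (step 703)
```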
[0040] A kidney glomeruli segmentation model is trained on a set of images (as depicted in the examples in FIGs. 8A-8E). Embodiments herein have used color-based augmentation for training. Tiles from Lab 2 through Lab 8 were normalized to a WuXi-like stain using the proposed method. FIG. 9 discloses the dice score for the segmentation model with and without stain normalization.
[0041] FIG. 10A depicts an example comparison with existing methods on a publicly available mitosis dataset. FIG. 10B depicts a comparison with a perceptual similarity measure.
[0042] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0043] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
STATEMENT OF CLAIMS
We claim:
1. A method for stain normalization of histopathological images, the method comprising
transforming (502), by a modified High Resolution Network (HRNet) (201), at least one input histopathological image;
determining (504), by a pre-trained Convolutional Neural Network (CNN) (202), a content loss and a style loss using the transformed image, the at least one input image and a reference image;
determining (505), by a discriminator (203), an adversarial loss using the transformed image and the reference image;
determining (505), by the discriminator (203), a discriminator loss;
determining (506), by a loss determination module (204), a total stain transfer generator loss using the determined adversarial loss, the determined content loss, the determined discriminator loss and the determined style loss; and
normalizing (508), by the HRNet (201), the at least one input histopathological image using the determined total stain transfer generator loss to a desired stain.
2. The method, as claimed in claim 1, wherein the HRNet (201) learns residual transformation using a direct skip connection, wherein the direct skip connection is from an input of the HRNet (201) to an output of the HRNet (201).
3. The method, as claimed in claim 1, wherein the HRNet (201) directly transfers high-resolution features from layer to layer without downsampling.
4. The method, as claimed in claim 1, wherein the pre-trained CNN (202) is a pre-trained neural network VGG-16/19, which is trained on an ImageNet dataset.
5. The method, as claimed in claim 1, wherein the discriminator (203) distinguishes between at least one reference image and at least one fake image generated using a stain transfer generator network.
6. The method, as claimed in claim 1, wherein determining, by the pre-trained CNN (202), the style loss comprises computing the style loss from a style representation obtained by passing the reference image and the transformed image through a pre-trained CNN, wherein the style representation is a Gram matrix of the CNN features at each layer and each element of the Gram matrix is given by an inner product of vectorized feature maps.
7. The method, as claimed in claim 1, wherein determining, by the pre-trained CNN (202), the content loss comprises computing the content loss from at least one content feature obtained by passing the at least one input image and the transformed image through a pre-trained CNN.
8. The method, as claimed in claim 1, wherein the adversarial loss is a log of one minus the discriminator output on the transformed image.
9. The method, as claimed in claim 1, wherein determining, by the discriminator (203), the discriminator loss comprises calculating the discriminator loss as a log loss between the reference image and the transformed image.
10. The method, as claimed in claim 1, wherein the discriminator (203) iteratively minimizes the total stain transfer generator loss.
11. The method, as claimed in claim 1, wherein the method comprises determining if style transfer is to be performed for a new study, based on a distance of a stain of the new study from a reference stain.
12. The method, as claimed in claim 11, wherein performing the style transfer comprises
pairing a set of tiles for an organ under study with a set of reference tiles; and
training a style transfer model using the paired sets of tiles.
13. A system for stain normalization of histopathological images, the system comprising
a modified High Resolution Network (HRNet) (201);
a pre-trained Convolutional Neural Network (CNN) (202);
a discriminator (203); and
a loss determination module (204), wherein the system is configured to
transform at least one input histopathological image;
determine a content loss and a style loss using the transformed image, the at least one input image and a reference image;
determine an adversarial loss using the transformed image and the reference image;
determine a discriminator loss;
determine a total stain transfer generator loss using the determined adversarial loss, the determined content loss, the determined discriminator loss and the determined style loss; and
normalize the at least one input histopathological image using the determined total stain transfer generator loss to a desired stain.
14. The system, as claimed in claim 13, wherein the HRNet (201) is configured to learn residual transformation using a direct skip connection, wherein the direct skip connection is from an input of the HRNet (201) to an output of the HRNet (201).
15. The system, as claimed in claim 13, wherein the HRNet (201) is configured to directly transfer high-resolution features from layer to layer without downsampling.
16. The system, as claimed in claim 13, wherein the pre-trained CNN (202) is a pre-trained neural network VGG-16/19, which is trained on an ImageNet dataset.
17. The system, as claimed in claim 13, wherein the discriminator (203) is configured to distinguish between at least one reference image and at least one fake image generated using a stain transfer generator network.
18. The system, as claimed in claim 13, wherein the pre-trained CNN (202) is configured to determine the style loss by computing the style loss from a style representation obtained by passing the reference image and the transformed image through a pre-trained CNN, wherein the style representation is a Gram matrix of the CNN features at each layer and each element of the Gram matrix is given by an inner product of vectorized feature maps.
19. The system, as claimed in claim 13, wherein the pre-trained CNN (202) is configured to determine the content loss from at least one content feature obtained by passing the at least one input image and the transformed image through a pre-trained CNN.
20. The system, as claimed in claim 13, wherein the adversarial loss is a log of one minus the discriminator output on the transformed image.
21. The system, as claimed in claim 13, wherein the discriminator (203) is configured to determine the discriminator loss as a log loss between the reference image and the transformed image.
22. The system, as claimed in claim 13, wherein the discriminator (203) is configured to iteratively minimize the total stain transfer generator loss.
23. The system, as claimed in claim 13, wherein the system is configured to determine if style transfer is to be performed for a new study, based on a distance of a stain of the new study from a reference stain.
24. The system, as claimed in claim 23, wherein the system is configured to perform the style transfer by
pairing a set of tiles for an organ under study with a set of reference tiles; and
training a style transfer model using the paired sets of tiles.

Documents

Application Documents

# Name Date
1 202021031736-ABSTRACT [11-11-2022(online)].pdf 2022-11-11
2 202021031736-CLAIMS [11-11-2022(online)].pdf 2022-11-11
3 202021031736-COMPLETE SPECIFICATION [11-11-2022(online)].pdf 2022-11-11
4 202021031736-FER_SER_REPLY [11-11-2022(online)].pdf 2022-11-11
5 202021031736-OTHERS [11-11-2022(online)].pdf 2022-11-11
6 202021031736-FER.pdf 2022-05-12
7 Abstract1.jpg 2022-01-15
8 202021031736-COMPLETE SPECIFICATION [30-06-2021(online)].pdf 2021-06-30
9 202021031736-CORRESPONDENCE-OTHERS [30-06-2021(online)].pdf 2021-06-30
10 202021031736-DRAWING [30-06-2021(online)].pdf 2021-06-30
11 202021031736-FORM 18 [30-06-2021(online)].pdf 2021-06-30
12 202021031736-ENDORSEMENT BY INVENTORS [29-06-2021(online)].pdf 2021-06-29
13 202021031736-FORM 13 [29-06-2021(online)].pdf 2021-06-29
14 202021031736-POA [29-06-2021(online)].pdf 2021-06-29
15 202021031736-RELEVANT DOCUMENTS [29-06-2021(online)].pdf 2021-06-29
16 202021031736-DECLARATION OF INVENTORSHIP (FORM 5) [24-07-2020(online)].pdf 2020-07-24
17 202021031736-DRAWINGS [24-07-2020(online)].pdf 2020-07-24
18 202021031736-FORM 1 [24-07-2020(online)].pdf 2020-07-24
19 202021031736-POWER OF AUTHORITY [24-07-2020(online)].pdf 2020-07-24
20 202021031736-PROOF OF RIGHT [24-07-2020(online)].pdf 2020-07-24
21 202021031736-PROVISIONAL SPECIFICATION [24-07-2020(online)].pdf 2020-07-24
22 202021031736-STATEMENT OF UNDERTAKING (FORM 3) [24-07-2020(online)].pdf 2020-07-24

Search Strategy

1 202021031736E_12-05-2022.pdf