
A System and Method for 3D Modelling from Scanned Images

Abstract: A system and method of developing a three-dimensional (3D) model of a knee bone from a plurality of images is disclosed. The method includes selecting one or more images from the plurality of images and identifying Femur, Tibia and Cortical bone regions in the selected one or more images. The identified images are pre-processed for Femur, Tibia and Cortical bone segmentation using separate image processing actions respectively. The segmentation of the Femur, Tibia and Cortical bone regions is performed using a separate segmentation process for each of the Femur, Tibia and Cortical bone regions. The Femur, Tibia and Cortical bone segmented regions are merged, and the merged segmented regions are then post-processed to develop the three-dimensional (3D) model of the knee bone.


Patent Information

Application #
Filing Date
09 June 2015
Publication Number
52/2016
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
patents@ltts.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-08-29
Renewal Date

Applicants

L&T TECHNOLOGY SERVICES LIMITED
DLF IT SEZ PARK, 2ND FLOOR - BLOCK 3, 1/124, MOUNT POONAMALLEE ROAD, RAMAPURAM, CHENNAI - 600 089,

Inventors

1. HARIKRISHNAN RAMARAJU
L&T TECHNOLOGY SERVICES LTD., MYSORE CAMPUS, KIADB INDUSTRIAL AREA, HEBBAL, HOOTAGALLI, MYSORE 570 018,
2. GINEESH SUKUMARAN
L&T TECHNOLOGY SERVICES LTD., MYSORE CAMPUS, KIADB INDUSTRIAL AREA, HEBBAL, HOOTAGALLI, MYSORE 570 018,
3. BHARATH SHIVAPURAM
L&T TECHNOLOGY SERVICES LTD., MYSORE CAMPUS, KIADB INDUSTRIAL AREA, HEBBAL, HOOTAGALLI, MYSORE 570 018,
4. PARAM RAJPURA
L&T TECHNOLOGY SERVICES LTD., MYSORE CAMPUS, KIADB INDUSTRIAL AREA, HEBBAL, HOOTAGALLI, MYSORE 570 018,

Specification

FIELD OF INVENTION
The invention generally relates to a system and method for 3D modelling, and more specifically to 3D modelling from scanned images.
BACKGROUND
Arthroplasty is a surgical procedure to restore the integrity and function of a joint. It typically involves an orthopedic surgery where the articular surface of a musculoskeletal joint is replaced, remodeled, or realigned by osteotomy or some other procedure. The replacement uses artificial, man-made components. The arthroplasty process involves studying the damaged part of the bones from X-ray / MRI / CT scan images, developing or designing a replacement bone component based on the information extracted from those images, and implanting it by a surgical procedure.
The design or development of the replacement bone component is typically performed by manually studying the X-ray / MRI / CT scan images, which leaves considerable scope for error. Hence, in most cases, surgeons have to perform alterations either on the replacement components or on the bones to make the components fit properly. In some cases the joint does not work properly even after such alterations. Moreover, if the replacement part is made of metal, the alteration is typically performed on the bone, which results in unnecessary removal of healthy bone.
Some solutions are known that process the images and may develop components with better precision. However, most of these solutions depend on the quality of the images received from the scan. Moreover, identifying the bone region is currently done by manual intervention, which is laborious, time consuming and prone to error.
Hence there is a need for a better method and system for designing 3D models which will aid in surgical procedures. The present invention is directed to overcoming one or more of the problems as set forth above.
SUMMARY OF THE INVENTION
According to embodiments of the invention, a system and method of developing a three-dimensional (3D) model of a knee bone from a plurality of images is disclosed. The disclosed method includes selecting one or more images from the plurality of images, identifying Femur, Tibia and Cortical bone regions in the selected one or more images, pre-processing the identified images for Femur, Tibia and Cortical bone segmentation using separate image processing actions respectively, segmenting the Femur, Tibia and Cortical bone regions using a separate segmentation process respectively for the Femur, Tibia and Cortical bone regions, merging the Femur, Tibia and Cortical bone segmented regions and post processing the merged segmented regions to develop the three-dimensional (3D) model of the knee bone.
BRIEF DESCRIPTION OF DRAWINGS
Other objects, features, and advantages of the invention will be apparent from the following description when read with reference to the accompanying drawings. In the drawings, wherein like reference numerals denote corresponding parts throughout the several views:

Figure 1 illustrates a block diagram of a process flow for creating a 3D model of the knee bone from a set of scanned images according to exemplary embodiments of the invention;
Figure 2 illustrates a block diagram of a process flow for image processing of Femur bone regions;
Figure 3 illustrates a block diagram of a process flow for image processing of Tibia bone regions;
Figure 4 illustrates a block diagram of a process flow for image processing of Cortical bone regions; and
Figure 5 illustrates an exemplary system for creating a 3D model of the knee bone from a set of scanned images, according to one embodiment of the invention.
DETAILED DESCRIPTION OF DRAWINGS
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
According to embodiments of the invention, a system and method of creating a 3D model of the bone that may be used to design a custom orthopaedic implant from a set of MRI / CT scanned images is disclosed. According to an embodiment, the bone is a knee bone.

Figure 1 illustrates a block diagram of a process 100 for developing a three-dimensional (3D) model of a knee bone from scanned images according to embodiments of the invention.
At step 102, one or more images may be selected from a plurality of images. The plurality of images may be magnetic resonance imaging (MRI) images or computed tomography (CT) images. According to an embodiment, the MRI images may be directly obtained from an MRI instrument or with a DICOM reader. According to another embodiment, the CT images may be obtained from a CT scan instrument or with a DICOM reader. According to yet another embodiment, the MRI or CT images may be obtained from a repository containing a plurality of MRI images or CT images. According to another embodiment, the images may comply with the Digital Imaging and Communications in Medicine (DICOM) standards.
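For illustration only, and not as part of the claimed method, a DICOM-compliant series of the kind referred to at step 102 could be read into a single 3D volume roughly as follows. The use of SimpleITK and the directory name are assumptions made for this sketch; the specification only states that the images may be obtained from an instrument, a DICOM reader or a repository.

# Illustrative sketch: load a CT/MRI DICOM series into a 3D volume.
import SimpleITK as sitk

def load_dicom_series(directory: str) -> sitk.Image:
    reader = sitk.ImageSeriesReader()
    file_names = reader.GetGDCMSeriesFileNames(directory)  # slice files, sorted
    reader.SetFileNames(file_names)
    return reader.Execute()  # single 3D image built from the slice stack

volume = load_dicom_series("./knee_scan_series")   # hypothetical folder
slices = sitk.GetArrayFromImage(volume)            # array of shape (z, y, x)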
At step 104, Femur, Tibia and Cortical bone regions in the selected one or more images may be identified. The Femur, Tibia and Cortical bone regions may be identified based on at least one seed selection in the selected one or more images. The seed selection is the starting point for further processing. According to an embodiment, the seed may be selected using any input device such as, but not limited to, computer mouse, digital pen etc.
Once the Femur, Tibia and Cortical bone regions are identified, separate image processing may be used for Femur, Tibia and Cortical bone regions at step 106, 108 and 110 respectively.
At step 106, the image processing for the Femur bone regions may be performed. Figure 2 illustrates exemplary steps involved in image processing of the Femur bone regions. According to an embodiment, the image processing step 106 of the Femur bone regions may include a pre-processing step 200, a segmentation step 210 and a post-processing step 212.
According to an embodiment, the pre-processing step 200 for Femur bone regions may include an image smoothing step 202, a sharpening step 204, an edge enhancement step 206 and/or an intensity transformations step 208. According to an embodiment, the illustrated pre-processing steps 200, that is, the image smoothing step 202, sharpening step 204, edge enhancement step 206 and/or intensity transformations step 208, may be performed in any sequence.
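A purely illustrative sketch of a pre-processing chain of the kind listed above (smoothing, sharpening, edge enhancement, intensity transformation), written with scikit-image, is given below; every filter choice and parameter value here is an assumption and not the patented implementation.

# Illustrative pre-processing sketch for a single Femur slice.
import numpy as np
from skimage import exposure, filters

def preprocess_femur_slice(img: np.ndarray) -> np.ndarray:
    # img assumed to be a float slice scaled to [0, 1]
    smoothed = filters.gaussian(img, sigma=1.0)                         # smoothing (202)
    sharpened = filters.unsharp_mask(smoothed, radius=2, amount=1.0)    # sharpening (204)
    edges = filters.sobel(sharpened)                                    # edge map
    enhanced = np.clip(sharpened + 0.5 * edges, 0.0, 1.0)               # edge enhancement (206)
    return exposure.rescale_intensity(enhanced, out_range=(0.0, 1.0))   # intensity transform (208)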
The segmentation step 210 for Femur bone regions may be performed after the pre-processing step 200. According to one embodiment, the segmentation step 210 for Femur bone regions may include an edge based Active Contour segmentation process.
The post processing step 212 of the segmented images for Femur bone regions may be carried out after the segmentation step 210 of the Femur bone regions. According to one embodiment, the post processing 212 of the segmented images for Femur bone regions may be based on a threshold value. According to an exemplary embodiment, the threshold value used may be a global threshold, that is, a single threshold value applied to the entire image.
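As an illustration of one possible reading of steps 210 and 212, an edge-based Active Contour segmentation seeded at a user-selected point, followed by a global-threshold post-processing step, might look as follows. The morphological implementation from scikit-image, the seed-box size and all parameter values are assumptions.

# Illustrative edge-based Active Contour segmentation and global-threshold post-processing.
import numpy as np
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def segment_femur(img: np.ndarray, seed_row: int, seed_col: int) -> np.ndarray:
    gimage = inverse_gaussian_gradient(img)               # edge-stopping function
    init_ls = np.zeros(img.shape, dtype=np.int8)
    init_ls[seed_row - 5:seed_row + 5, seed_col - 5:seed_col + 5] = 1  # seed region
    mask = morphological_geodesic_active_contour(gimage, 200,
                                                 init_level_set=init_ls,
                                                 smoothing=1, balloon=1)
    return mask.astype(bool)

def postprocess_global_threshold(img: np.ndarray, mask: np.ndarray,
                                 threshold: float = 0.5) -> np.ndarray:
    # a single (global) threshold value applied over the entire image
    return mask & (img > threshold)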

At step 108, the image processing for the Tibia bone regions may be performed. Figure 3 illustrates exemplary steps involved in image processing of the Tibia bone regions, which may include a pre-processing step 300, a segmentation step 310 and a post-processing step 312. According to an exemplary embodiment, the pre-processing step 300 of the Tibia bone regions may include an image smoothing step 302, a sharpening step 304, a contrast enhancement step 306 and/or an intensity transformations step 308. According to an embodiment, the illustrated pre-processing steps 300, that is, the image smoothing step 302, sharpening step 304, contrast enhancement step 306 and/or intensity transformations step 308, may be performed in any sequence.
The segmentation step 310 for Tibia bone regions may be performed after the pre-processing step 300. According to an exemplary embodiment, the segmentation step 310 for Tibia bone regions may include a region based Active Contour segmentation process.
The post processing step 312 for Tibia bone regions may be carried out after the segmentation step 310 on the segmented image of the Tibia bone region. According to an exemplary embodiment, the post processing step 312 for Tibia bone regions may be based on a threshold value. According to another exemplary embodiment, the threshold value used may be a global threshold, that is, a single threshold value applied to the entire image.
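For illustration, a region-based Active Contour (Chan-Vese style) segmentation for the Tibia region, again seeded at a user-selected point, might be sketched as below; the scikit-image routine and all parameters are assumptions rather than the patented process.

# Illustrative region-based Active Contour segmentation for the Tibia region.
import numpy as np
from skimage.segmentation import morphological_chan_vese

def segment_tibia(img: np.ndarray, seed_row: int, seed_col: int) -> np.ndarray:
    init_ls = np.zeros(img.shape, dtype=np.int8)
    init_ls[seed_row - 5:seed_row + 5, seed_col - 5:seed_col + 5] = 1  # seed region
    mask = morphological_chan_vese(img, 200, init_level_set=init_ls,
                                   smoothing=2, lambda1=1, lambda2=1)
    return mask.astype(bool)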
At step 110, the image processing for the Cortical bone regions may be performed. Figure 4 illustrates exemplary steps involved in the image processing of the Cortical bone regions. The image processing of the Cortical bone regions may include a pre-processing step 400, a segmentation step 402 and a post-processing step 404.
According to an embodiment, the pre-processing step 400 of Cortical bone regions may include an image smoothing process.

The segmentation step 402 for Cortical bone regions may be performed after the pre-processing step 400. According to an exemplary embodiment, the segmentation step 402 may include a region based Local Auto Threshold based segmentation process.
The post processing step 404 of the segmented images for Cortical bone regions may be carried out after segmentation step 402 of the Cortical bone regions. According to an exemplary embodiment, the post processing 404 of the segmented images for Cortical bone regions may include a filtering process.
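An illustrative sketch of a Local Auto Threshold segmentation with a simple filtering post-processing step for the Cortical region is given below; the block size, thresholding method and minimum object size are assumptions made for the example.

# Illustrative locally adaptive threshold segmentation with filtering post-processing.
import numpy as np
from skimage.filters import threshold_local
from skimage.morphology import remove_small_objects

def segment_cortical(img: np.ndarray) -> np.ndarray:
    local_thresh = threshold_local(img, block_size=51, method='gaussian')
    mask = img > local_thresh                        # locally adaptive binarisation
    return remove_small_objects(mask, min_size=64)   # filtering of small artefacts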
According to an embodiment, a user may manually set the parameters for segmentation process of the Femur, Tibia and Cortical bone regions.
Referring back to figure 1, at step 112, the Femur bone, Tibia bone and Cortical bone segmented regions may be merged after subjecting to separate image processing as illustrated in figure 2, figure 3 and figure 4. According to an embodiment, the merging of the segmented bones may be automatic and may not require user input.
At step 114, the merged segmented regions may be post-processed to develop a three-dimensional (3D) model of a knee bone. According to an embodiment, a user may edit the 3D model by adding an appropriate region or removing an inappropriate region using drawing tools for making minor corrections.
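One possible reading of the merge (step 112) and reconstruction (step 114) is sketched below: per-region mask volumes are combined with a logical OR and the merged volume is turned into a surface model. The use of a marching-cubes surface extraction is an assumption for illustration only, as the specification does not name a reconstruction algorithm.

# Illustrative merge of the three region masks and surface extraction of the knee bone.
import numpy as np
from skimage.measure import marching_cubes

def build_knee_model(femur: np.ndarray, tibia: np.ndarray,
                     cortical: np.ndarray):
    # femur, tibia, cortical assumed to be boolean volumes of identical shape
    merged = femur | tibia | cortical                          # automatic merge (step 112)
    verts, faces, normals, values = marching_cubes(merged.astype(np.float32),
                                                   level=0.5)  # step 114
    return verts, faces                                        # triangle mesh of the knee bone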
Figure 5 illustrates an exemplary system 500 for creating a 3D model of the bone from a set of scanned images, according to one embodiment of the invention. The scanned images may be stored in a repository 502. According to an embodiment, the repository may contain MRI images or CT images. The system may include a processor 504 for creating a 3D model of the bone from a set of scanned images. One or more images may be selected from the repository and provided as input to the processor 504.
The processor 504 may select scanned images from the repository 502 and may identify Femur, Tibia and Cortical bone regions in the selected scanned images based on at least one seed selection in the scanned images. According to one embodiment, the seed may be selected using any input device such as, but not limited to, computer mouse, digital pen etc.
The processor 504 has a pre-processing module 506, segmentation module 514 and post-processing module 522.
The pre-processing module 506 may perform pre-processing of the selected images for segmentation of the identified bone regions. According to an exemplary embodiment, the pre-processing module 506 may have a sub module 508 for pre-processing Femur bone regions, a sub module 510 for pre-processing Tibia bone regions and a sub module 512 for pre-processing Cortical bone regions.
The pre-processing of Femur bone regions may be performed in sub module 508. According to an embodiment, the sub module 508 may perform image processing actions including image smoothing, sharpening, edge enhancement and intensity transformations. The pre-processing of Tibia bone regions may be performed in sub module 510. According to another embodiment, the sub module 510 may perform image processing actions including image smoothing, sharpening, contrast enhancement and intensity transformations. Furthermore, the pre-processing of Cortical bone regions may be performed in sub module 512. According to an embodiment, the sub module 512 may perform image processing actions including image smoothing.
The segmentation module 514 may perform segmentation process for the Femur, Tibia and Cortical bone regions. According to an exemplary embodiment, the segmentation module 514 may have a sub module 516 for performing segmentation process of the Femur bone regions, a sub module 518 for performing segmentation process of the Tibia bone regions and a sub module 520 for performing segmentation process of the Cortical bone regions.
The segmentation process of Femur bone regions may be performed in sub module 516. According to an embodiment, the sub module 516 may perform an edge based Active Contour segmentation process. The segmentation process for Tibia bone regions may be performed in sub module 518. According to another embodiment, the sub module 518 may perform a region based Active Contour segmentation process. The segmentation process for Cortical bone regions may be performed in sub module 520. According to yet another embodiment, the sub module 520 may perform a Local Auto Threshold based segmentation process. Furthermore, according to an embodiment, a user may manually set the parameters for segmentation of Femur, Tibia and Cortical bone regions.
The post-processing module 522 may perform post processing of the segmented images for Femur, Tibia and Cortical bone regions. According to an exemplary embodiment, the post-processing module 522 may have a sub module 524 for post-processing Femur bone regions, a sub module 526 for post-processing Tibia bone regions and a sub module 528 for post-processing Cortical bone regions.

The post-processing of the segmented images of the Femur and Tibia bone regions may be performed in sub modules 524 and 526 respectively. According to one embodiment, post processing of the segmented images for Femur and Tibia bone regions may be based on a threshold value. According to another embodiment, the threshold value used may be a global threshold, i.e., a single threshold value applied to the entire image.
The sub module 528 may perform post processing of the segmented images of the Cortical bone regions. According to an embodiment, the post processing of the segmented images for Cortical bone regions may include a filtering technique.
The processor 504 may further perform merging of Femur bone, Tibia bone and Cortical bone segmented regions and post-processing of the merged segmented regions to generate a 3D model of the knee bone.
The processor 504 may generate output in the form of a 3D model of a knee bone on a display device 530. According to an embodiment, the user may edit the 3D model on the display device 530 by adding an appropriate region or removing an inappropriate region using drawing tools for making minor corrections. According to one embodiment, the display device may be any display such as but not limited to Cathode ray tube display (CRT), Light-emitting diode display (LED), Electroluminescent display (ELD), Plasma display panel (PDP) etc. According to another embodiment, the display may include a graphical user interface (GUI).
In the drawings and specification there have been set forth preferred embodiments of the invention, and although specific terms are employed, these are used in a generic and descriptive sense only and not for purposes of limitation. Changes in the form and the proportion of parts, as well as in the substitution of equivalents, are contemplated as circumstances may suggest or render expedient without departing from the spirit or scope of the invention.
Throughout the various contexts described in this disclosure, the embodiments of the invention further encompass computer apparatus, computing systems and machine-readable media configured to carry out the foregoing systems and methods. In addition to an embodiment consisting of specifically designed integrated circuits or other electronics, the present invention may be conveniently implemented using a conventional general purpose or a specialized digital computer or microprocessor programmed according to the teachings of the present disclosure, as will be apparent to those skilled in the computer art.
Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those skilled in the software art. The invention may also be implemented by the preparation of application specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as will be readily apparent to those skilled in the art.

We Claim:
1. A method of developing a three-dimensional (3D) model of a knee bone from a plurality of images, the method comprises:
selecting one or more images from the plurality of images;
identifying Femur, Tibia and Cortical bone regions in the selected one or more images;
pre-processing the selected one or more images for Femur bone segmentation, Tibia bone segmentation and Cortical bone segmentation using separate image processing actions respectively for the Femur, Tibia and Cortical bone regions;
segmenting the Femur, Tibia and Cortical bone regions using a separate segmentation process respectively for the Femur, Tibia and Cortical bone regions;
merging the Femur, Tibia and Cortical bone segmented regions; and
post processing the merged segmented regions to develop the three-dimensional (3D) model of the knee bone.
2. The method as claimed in claim 1, wherein the Femur, Tibia and Cortical bone regions are identified based on at least one seed selection in the selected one or more images.
3. The method as claimed in claim 1, wherein the pre-processing of Femur bone regions segmentation includes image smoothing, sharpening, edge enhancement and intensity transformations.
4. The method as claimed in claim 1, wherein the pre-processing of Tibia bone regions segmentation includes image smoothing, sharpening, contrast enhancement and intensity transformations.

5. The method as claimed in claim 1, wherein the pre-processing of Cortical bone regions segmentation includes image smoothing.
6. The method as claimed in claim 1, wherein the segmentation process for Femur bone regions includes edge based Active Contour segmentation.
7. The method as claimed in claim 1, wherein the segmentation process for Tibia bone regions includes region based Active Contour segmentation.
8. The method as claimed in claim 1, wherein the segmentation process for Cortical bone regions includes Auto Local Threshold based segmentation.
9. The method as claimed in claim 1, further comprising post processing after the segmentation process for the cortical bone regions, the post processing including filtering technique.
10. The method as claimed in claim 1, optionally comprises a module for manually modifying one or more images before generating the 3D model.

Documents

Application Documents

# Name Date
1 2867-CHE-2015-IntimationOfGrant29-08-2024.pdf 2024-08-29
2 2867-CHE-2015-PatentCertificate29-08-2024.pdf 2024-08-29
3 2867-CHE-2015-PETITION UNDER RULE 137 [19-04-2024(online)].pdf 2024-04-19
4 2867-CHE-2015-Written submissions and relevant documents [19-04-2024(online)].pdf 2024-04-19
5 2867-CHE-2015-Correspondence to notify the Controller [02-04-2024(online)].pdf 2024-04-02
6 2867-CHE-2015-FORM-26 [02-04-2024(online)].pdf 2024-04-02
7 2867-CHE-2015-US(14)-HearingNotice-(HearingDate-10-04-2024).pdf 2024-03-27
8 2867-CHE-2015-Correspondence_Request For Update Mail ID_30-06-2022.pdf 2022-06-30
9 2867-CHE-2015-Covering Letter [27-05-2022(online)].pdf 2022-05-27
10 2867-CHE-2015-PETITION u-r 6(6) [27-05-2022(online)].pdf 2022-05-27
11 2867-CHE-2015-Correspondence_Amend the email addresses_14-12-2021.pdf 2021-12-14
12 2867-CHE-2015-CLAIMS [06-12-2021(online)].pdf 2021-12-06
13 2867-CHE-2015-CORRESPONDENCE [06-12-2021(online)].pdf 2021-12-06
14 2867-CHE-2015-DRAWING [06-12-2021(online)].pdf 2021-12-06
15 2867-CHE-2015-FER_SER_REPLY [06-12-2021(online)].pdf 2021-12-06
16 2867-CHE-2015-OTHERS [06-12-2021(online)].pdf 2021-12-06
17 2867-CHE-2015-FER.pdf 2021-10-17
18 Correspondence by Applicant _Form 18_31-05-2019.pdf 2019-05-31
19 Form18_Normal Request_31-05-2019.pdf 2019-05-31
20 Correspondence by Applicant_Form3_23-11-2017.pdf 2017-11-23
21 Form3_After Filing_23-11-2017.pdf 2017-11-23
22 2867-CHE-2015-Correspondence-F1-030816.pdf 2016-08-09
23 2867-CHE-2015-Form 1-030816.pdf 2016-08-09
24 2867-CHE-2015-Form 2(Title Page)-090616.pdf 2016-07-25
25 2867-CHE-2015-Drawing-090616.pdf 2016-07-25
26 2867-CHE-2015-Description(Complete)-090616.pdf 2016-07-25
27 2867-CHE-2015-Correspondence-Abstract-Claims-Description-Drawing-F2-090616.pdf 2016-07-25
28 2867-CHE-2015-Claims-090616.pdf 2016-07-25
29 2867-CHE-2015-Abstract-090616.pdf 2016-07-25
30 2867-CHE-2015 FORM-5 09-06-2015.pdf 2015-06-09
31 2867-CHE-2015 FORM-3 09-06-2015.pdf 2015-06-09
32 2867-CHE-2015 FORM-2 09-06-2015.pdf 2015-06-09
33 2867-CHE-2015 FORM-1 09-06-2015.pdf 2015-06-09
34 2867-CHE-2015 DRAWINGS 09-06-2015.pdf 2015-06-09
35 2867-CHE-2015 DESCRIPTION (PROVISIONAL) 09-06-2015.pdf 2015-06-09
36 2867-CHE-2015 CORRESPONDENCE OTHERS 09-06-2015.pdf 2015-06-09

Search Strategy

1 2021-04-0814-25-36E_08-04-2021.pdf

ERegister / Renewals

3rd: 19 Nov 2024

From 09/06/2017 - To 09/06/2018

4th: 19 Nov 2024

From 09/06/2018 - To 09/06/2019

5th: 19 Nov 2024

From 09/06/2019 - To 09/06/2020

6th: 19 Nov 2024

From 09/06/2020 - To 09/06/2021

7th: 19 Nov 2024

From 09/06/2021 - To 09/06/2022

8th: 19 Nov 2024

From 09/06/2022 - To 09/06/2023

9th: 19 Nov 2024

From 09/06/2023 - To 09/06/2024

10th: 19 Nov 2024

From 09/06/2024 - To 09/06/2025

11th: 19 Nov 2024

From 09/06/2025 - To 09/06/2026