
A System For Computer Assisted Surgery And A Method Thereof

Abstract: A system for computer assisted surgery is provided. The system includes a patient tracker positioned between a patient and an image capturing device, wherein the patient tracker comprises a unique pattern. The image capturing device determines co-ordinates by reading the unique patterns on the patient tracker and the navigation probe. A control unit is configured to determine a position of the patient, via the patient tracker, using a predefined set of instructions, and to determine one or more registration points using predefined data of the patient. The navigation probe matches the one or more registration points on the patient with reference to the patient tracker. A tracker-less navigation module utilizes the image capturing device as a reference and the last co-ordinates to resume surgical navigation upon the image capturing device losing sight of the patient tracker. FIG. 1


Patent Information

Application #
201941047042
Filing Date
19 November 2019
Publication Number
01/2020
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Email
filings@ipexcel.com
Parent Application

Applicants

HAPPY RELIABLE SURGERIES PVT. LTD
#752, 8th Main, Mahalakshmi Layout, Bangalore-560086, Karnataka, India

Inventors

1. ARPIT PALIWAL
Happy Reliable Surgeries Pvt. Ltd; #752, 8th Main, Mahalakshmi Layout, Bangalore-560086, Karnataka, India

Specification

Claims:
1. A system for computer assisted surgery comprising:
a patient tracker (102) positioned between a patient and an image capturing device (104), wherein the patient tracker (102) comprises a unique pattern;
the image capturing device (104) configured to determine co-ordinates by reading the unique pattern on the patient tracker (102);
a control unit operatively coupled to the image capturing device (104), wherein the control unit is configured to:
determine a position of the patient, via the patient tracker (102), using a predefined set of instructions;
determine one or more registration points using predefined data of the patient;
at least one navigation probe (106) operatively coupled to the control unit, wherein the at least one navigation probe (106), which includes a unique pattern, is configured to match the one or more registration points on the patient with reference to the patient tracker (102); and
a tracker-less navigation module operatively coupled to the control unit, wherein the tracker-less navigation module is configured to utilize the image capturing device (104) as a reference and to utilize the last co-ordinates to resume surgical navigation upon the image capturing device (104) losing sight of the patient tracker (102).
2. The system as claimed in claim 1, wherein the patient tracker (102) is placed within a predefined distance of the patient.
3. The system as claimed in claim 1, wherein the patient tracker (102), which moves simultaneously with the patient, is required to be visible to the image capturing device (104).
4. The system as claimed in claim 1, wherein the patient tracker (102) is utilized as a reference by the image capturing device (104) to determine the position of the patient.
5. The system as claimed in claim 1, wherein the predefined data comprises computed tomography and magnetic resonance imaging data of the patient.
6. A method (300) comprising:
positioning a patient tracker (102) between a patient and an image capturing device (104), wherein the patient tracker (102) comprises a unique pattern;
determining, by the image capturing device (104), co-ordinates by reading the unique pattern on the patient tracker (102);
determining, by a control unit, a position of the patient, via the patient tracker (102), using a predefined set of instructions;
determining, by the control unit, one or more registration points using predefined data of the patient;
matching, by at least one navigation probe (106), the one or more registration points on the patient with reference to the patient tracker (102);
utilising, by a tracker-less navigation module, the image capturing device (104) as a reference; and
utilizing, by the tracker-less navigation module, the last co-ordinates to resume surgical navigation upon the image capturing device (104) losing sight of the patient tracker (102).
7. The method (300) as claimed in claim 6, further comprising placing the patient tracker (102) within a predefined distance of the patient.
8. The method (300) as claimed in claim 6, further comprising utilizing the patient tracker (102) as a reference by the image capturing device (104) to determine the position of the patient.

Dated this 19th day of November 2019

Signature

Vidya Bhaskar Singh Nandiyal
Patent Agent (IN/PA-2912)
Agent for the Applicant

Description:
FIELD OF THE INVENTION
[0001] Embodiments of the present disclosure relate to methods, processes, apparatus, and systems for adjustable configurations of a tracking arrangement for a navigated surgical tool, and more particularly to, a system for computer assisted surgery and a method thereof.
BACKGROUND
[0002] Optical navigation, as well as other navigation, is used in surgery to track a rigid body's location in space in relation to a tool. These systems often rely upon the use of a camera and markers, the positions of which are tracked by the camera as discussed further hereinbelow. Accordingly, using the known spatial relationship of the markers on the image frame, the 3D position of the tool in relation to the rigid body can be known while the camera can sense the markers. Display software may further be used to display the 3D position of the tool in relation to the rigid body, so that a virtual, real-time image of the tool and the surrounding anatomy of the patient may be made available to the surgeon to aid in the surgery.
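The position arithmetic behind such navigation is a chain of rigid transforms: the camera reports each tracker's pose in the camera's co-ordinate frame, and the tool's pose relative to the rigid body follows by inverting one pose and composing it with the other. A minimal Python sketch of that composition, using 4x4 homogeneous matrices and entirely hypothetical pose values:

import numpy as np

def make_transform(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a translation.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical poses reported by the camera for two tracked bodies,
# each expressed in the camera's co-ordinate frame.
T_cam_body = make_transform(np.eye(3), [0.10, 0.00, 0.50])  # rigid-body (patient) tracker
T_cam_tool = make_transform(np.eye(3), [0.12, 0.05, 0.45])  # tool tracker

# Tool pose relative to the rigid body: invert one transform and compose.
T_body_tool = np.linalg.inv(T_cam_body) @ T_cam_tool
print(T_body_tool[:3, 3])  # tool position in the rigid body's frame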
[0003] Since the navigation system is optical, all trackers must remain within the sensing range of the camera during a surgical procedure to avoid complications. If a tracker is physically blocked or moves out of the camera's view, real-time tracking stops until the tracker is moved back into the workspace sensing range (i.e., back into the camera's view). This creates problems during surgery if the tracked tool and/or objects are not detected, and may require repositioning whenever the tracker is not visible to the camera.
[0004] Therefore, there is a requirement for a system that can overcome the aforementioned issues.
BRIEF DESCRIPTION
[0005] In accordance with one embodiment of the disclosure, a system for computer assisted surgery is provided. The system includes a patient tracker positioned between a patient and an image capturing device, wherein the patient tracker comprises a unique pattern. The image capturing device is configured to determine co-ordinates by reading the unique pattern on the patient tracker.
[0006] The system includes a control unit operatively coupled to the image capturing device, wherein the control unit is configured to determine a position of the patient, via the patient tracker, using a predefined set of instructions, and to determine one or more registration points using predefined data of the patient.
[0007] The system also includes at least one navigation probe operatively coupled to the control unit, wherein the at least one navigation probe, which includes a unique pattern, is configured to match the one or more registration points on the patient with reference to the patient tracker.
[0008] The system also includes a tracker-less navigation module operatively coupled to the control unit, wherein the tracker-less navigation module is configured to utilize the image capturing device as a reference and to utilize the last co-ordinates to resume surgical navigation upon the image capturing device losing sight of the patient tracker.
[0009] In accordance with another embodiment of the disclosure, a method thereof is provided. The method includes positioning a patient tracker between a patient and an image capturing device, wherein the patient tracker comprises a unique pattern. The method also includes determining, by the image capturing device, co-ordinates by reading the unique pattern on the patient tracker.
[0010] The method also includes determining, by a control unit, a position of the patient, via the patient tracker, using predefined set of instructions; and determining, by the control unit, one or more registration points using predefined data of the patient.
[0011] The method also includes matching, by at least one navigation probe, the one or more registration points on the patient with reference to the patient tracker.
[0012] The method also includes utilizing, by the tracker-less navigation module, the image capturing device as a reference; and utilizing, by the tracker-less navigation module, the last co-ordinates to resume surgical navigation upon the image capturing device losing sight of the patient tracker.
[0013] In a further embodiment of the disclosed method, the method also includes the additional step of placing the patient tracker within a predefined distance of the patient.
[0014] In a further embodiment of the disclosed method, the method also includes the additional step of utilizing the patient tracker as a reference by the image capturing device to determine the position of the patient.
[0015] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0016] FIG. 1 illustrates a pictorial depiction of a system for computer assisted surgery, wherein the patient tracker is visible to the image capturing device in accordance with an embodiment of the present disclosure;
[0017] FIG. 2 illustrates a pictorial depiction of a system for computer assisted surgery, wherein the patient tracker is not visible to the image capturing device in accordance with an embodiment of the present disclosure; and
[0018] FIG. 3 illustrates a flow chart representing steps involved in a method for the system of FIG. 1 in accordance with an embodiment of the present disclosure.
[0019] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0020] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0021] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrases "in an embodiment", "in another embodiment" and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0022] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0023] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0024] FIG. 1 illustrates a pictorial depiction of a system for computer assisted surgery when a patient tracker (102) is visible to the image capturing device (104) in accordance with an embodiment of the present disclosure.
[0025] The system includes a patient tracker (102), an image capturing device (104), a control unit (not shown in FIG. 1), at least one navigation probe (106) and a tracker-less navigation module (not shown in FIG. 1). As shown in FIG. 1, the patient tracker is not blocked by a medical practitioner.
[0026] The patient tracker (102) is positioned between a patient and the image capturing device (104), wherein the patient tracker (102) is coupled to the patient. The patient tracker (102) includes a unique pattern. The at least one navigation probe (106) includes a unique pattern.
[0027] The image capturing device (104) is configured to track the patient tracker (102), as the image capturing device (104) cannot track the patient. The image capturing device (104) is configured to read the unique pattern on the patient tracker (102) and the at least one navigation probe (106), wherein the image capturing device (104) considers the patient tracker (102) as a reference.
[0028] Upon reading the unique patterns of the patient tracker (102) and the at least one navigation probe (106), the image capturing device (104) determines co-ordinates of the patient and the at least one navigation probe (106) from the unique patterns by using a predefined set of instructions.
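The specification does not name a particular pattern type or detection library. As one illustration of reading a unique pattern and recovering co-ordinates from it, the sketch below assumes an ArUco-style fiducial detected with OpenCV (4.7 or later for the ArucoDetector API); camera_matrix and dist_coeffs are assumed to come from a prior camera calibration, and MARKER_SIDE is a hypothetical marker size:

import cv2
import numpy as np

MARKER_SIDE = 0.05  # marker edge length in metres (assumed)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def locate_tracker(frame, camera_matrix, dist_coeffs):
    # Return (rvec, tvec) of the first detected marker in the camera frame, or None.
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None  # pattern not visible in this frame
    # 3D corner positions of the square marker in its own co-ordinate frame.
    half = MARKER_SIDE / 2.0
    obj = np.array([[-half,  half, 0], [ half,  half, 0],
                    [ half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj, corners[0].reshape(4, 2), camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None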
[0029] In one embodiment, the control unit uses predefined data of the patient to determine one or more registration points. In such an embodiment, the patient undergoes patient calibration, wherein the patient is marked with one or more registration points, derived from the predefined data of the patient, using the at least one navigation probe (106) with reference to the patient tracker (102). In one embodiment, the predefined data of the patient includes, but is not limited to, computed tomography and magnetic resonance imaging data of the patient.
[0030] Using the predefined data of the patient, along with the image capturing device (104) and the predefined set of instructions, the accuracy is checked, and the surgical navigation process is initiated. To maintain the location or position of the patient, the patient tracker (102) needs to be visible to the image capturing device (104) throughout the surgery.
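The patient calibration and accuracy check described above amount to point-based rigid registration: paired points touched with the probe and picked in the CT/MRI data are brought into alignment, and the residual error is inspected before navigation begins. A sketch using the standard SVD-based Horn/Kabsch solution, which is one common choice rather than the system's confirmed method:

import numpy as np

def register_rigid(probe_pts, image_pts):
    # Least-squares rigid transform mapping probe-touched points onto the
    # corresponding CT/MRI points; both inputs are paired (N, 3) arrays.
    p_mean, q_mean = probe_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (probe_pts - p_mean).T @ (image_pts - q_mean)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

def registration_error(probe_pts, image_pts, R, t):
    # RMS fiducial registration error -- one way the accuracy could be checked.
    residuals = image_pts - (probe_pts @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))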
[0031] FIG. 2 illustrates a pictorial depiction of a system for computer assisted surgery when the patient tracker (102) is not visible to the image capturing device (104) in accordance with an embodiment of the present disclosure.
[0032] As shown in FIG. 2, the patient tracker (102) may not be visible to the image capturing device (104), which must read the unique patterns to determine the position of the patient undergoing the surgery. In one such embodiment, the control unit is configured to read co-ordinates from the image capturing device (104) at every frame and interpret the values of every frame. In one embodiment, when the patient tracker (102) is not in the sight of the image capturing device (104), the tracker-less navigation module automatically uses the last co-ordinates received, treats the image capturing device (104) as a reference, and resumes the surgical navigation procedure.
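A minimal sketch of that fallback behaviour, with hypothetical names: the patient pose is cached at every frame while the tracker is visible, and on occlusion navigation resumes against the last co-ordinates received, with the camera itself as the fixed reference. This implicitly assumes the patient and camera do not move during the occlusion:

import numpy as np

class TrackerlessNavigation:
    def __init__(self):
        self.last_T_cam_patient = None  # last co-ordinates of the patient tracker

    def update(self, T_cam_patient):
        # Call once per frame; T_cam_patient is None when the tracker is occluded.
        if T_cam_patient is not None:
            self.last_T_cam_patient = T_cam_patient  # normal tracked navigation
        return self.last_T_cam_patient               # occluded: resume from cache

    def tool_in_patient(self, T_cam_tool):
        # Tool pose in the patient frame, valid in tracked and fallback modes alike.
        if self.last_T_cam_patient is None:
            raise RuntimeError("patient tracker never seen; cannot navigate")
        return np.linalg.inv(self.last_T_cam_patient) @ T_cam_tool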
[0033] FIG. 3 illustrates a flow chart representing steps involved in a method (300) of FIG. 1 in accordance with an embodiment of the present disclosure.
[0034] The method (300) includes positioning a patient tracker (102) between a patient and an image capturing device (104), in step 302. The method includes positioning the patient tracker (102) between the patient and the image capturing device (104), wherein the patient tracker (102) and the at least one navigation probe (106) each include a unique pattern.
[0035] The method (300) includes determining co-ordinates by reading the unique pattern on the patient tracker (102), in step 304. The method includes determining, by the image capturing device (104), the co-ordinates by reading the unique pattern on the patient tracker (102). The image capturing device (104) is configured to track the patient tracker (102), as the image capturing device (104) cannot track the patient. The image capturing device (104) is configured to read the unique pattern on the patient tracker (102) and the at least one navigation probe (106), wherein the image capturing device (104) considers the patient tracker (102) as a reference. Upon reading the unique patterns of the patient tracker (102) and the at least one navigation probe (106), the image capturing device (104) determines co-ordinates of the patient and the at least one navigation probe (106) from the unique patterns by using a predefined set of instructions.
[0036] The method (300) includes determining a position of the patient, via the patient tracker (102), using a predefined set of instructions, in step 306. The method (300) includes determining, by a control unit, the position of the patient, via the patient tracker (102), using the predefined set of instructions.
[0037] The method (300) includes determining one or more registration points using predefined data of the patient, in step 308. The method (300) includes determining, by the control unit, the one or more registration points using the predefined data of the patient. In one embodiment, the control unit uses the predefined data of the patient to determine the one or more registration points. In such an embodiment, the patient undergoes patient calibration, wherein the patient is marked with the one or more registration points, derived from the predefined data of the patient, using the at least one navigation probe (106) with reference to the patient tracker (102). In one embodiment, the predefined data of the patient includes, but is not limited to, computed tomography and magnetic resonance imaging data of the patient.
[0038] The method (300) includes matching, by the at least one navigation probe (106), the one or more registration points on the patient with reference to the patient tracker (102), in step 310. In such an embodiment, the patient undergoes patient calibration, wherein the patient is marked with the one or more registration points, derived from the predefined data of the patient, using the at least one navigation probe (106) with reference to the patient tracker (102).
[0039] The method (300) includes utilising, by a tracker-less navigation module, the image capturing device (104) as a reference, in step 312. When the patient tracker (102) is not visible to the image capturing device (104), then the tracker-less navigation module is configured to utilize the image capturing device (104) as the reference.
[0040] The method (300) includes utilizing, by the tracker-less navigation module, the last co-ordinates to resume surgical navigation upon the image capturing device (104) losing sight of the patient tracker (102), in step 314.
[0041] The present disclosure provides various advantages, including but not limited to, allowing computer assisted surgeries to proceed without continuous use of a tracker to track the navigation probe and the patient.
[0042] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0043] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and is not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.

Documents

Application Documents

# Name Date
1 201941047042-STATEMENT OF UNDERTAKING (FORM 3) [19-11-2019(online)].pdf 2019-11-19
2 201941047042-PROOF OF RIGHT [19-11-2019(online)].pdf 2019-11-19
3 201941047042-POWER OF AUTHORITY [19-11-2019(online)].pdf 2019-11-19
4 201941047042-FORM FOR STARTUP [19-11-2019(online)].pdf 2019-11-19
5 201941047042-FORM FOR SMALL ENTITY(FORM-28) [19-11-2019(online)].pdf 2019-11-19
6 201941047042-FORM 1 [19-11-2019(online)].pdf 2019-11-19
7 201941047042-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [19-11-2019(online)].pdf 2019-11-19
8 201941047042-EVIDENCE FOR REGISTRATION UNDER SSI [19-11-2019(online)].pdf 2019-11-19
9 201941047042-DRAWINGS [19-11-2019(online)].pdf 2019-11-19
10 201941047042-DECLARATION OF INVENTORSHIP (FORM 5) [19-11-2019(online)].pdf 2019-11-19
11 201941047042-COMPLETE SPECIFICATION [19-11-2019(online)].pdf 2019-11-19
12 abstract 201941047042.jpg 2019-11-21
13 Correspondence by Agent_Form1-Form3-Form5-Form28-Power of Attorney_25-11-2019.pdf 2019-11-25
14 201941047042-STARTUP [31-12-2019(online)].pdf 2019-12-31
15 201941047042-FORM28 [31-12-2019(online)].pdf 2019-12-31
16 201941047042-FORM-9 [31-12-2019(online)].pdf 2019-12-31
17 201941047042-FORM 18A [31-12-2019(online)].pdf 2019-12-31
18 201941047042-FER.pdf 2020-01-13
19 201941047042-RELEVANT DOCUMENTS [12-05-2020(online)].pdf 2020-05-12
20 201941047042-OTHERS [12-05-2020(online)].pdf 2020-05-12
21 201941047042-MARKED COPIES OF AMENDEMENTS [12-05-2020(online)].pdf 2020-05-12
22 201941047042-FORM 3 [12-05-2020(online)].pdf 2020-05-12
23 201941047042-FORM 13 [12-05-2020(online)].pdf 2020-05-12
24 201941047042-FER_SER_REPLY [12-05-2020(online)].pdf 2020-05-12
25 201941047042-ENDORSEMENT BY INVENTORS [12-05-2020(online)].pdf 2020-05-12
26 201941047042-DRAWING [12-05-2020(online)].pdf 2020-05-12
27 201941047042-CLAIMS [12-05-2020(online)].pdf 2020-05-12
28 201941047042-AMMENDED DOCUMENTS [12-05-2020(online)].pdf 2020-05-12
29 201941047042-US(14)-HearingNotice-(HearingDate-24-08-2020).pdf 2020-07-22
30 201941047042-Correspondence to notify the Controller [30-07-2020(online)].pdf 2020-07-30
31 201941047042-FORM-26 [21-08-2020(online)].pdf 2020-08-21
32 201941047042-Written submissions and relevant documents [08-09-2020(online)].pdf 2020-09-08
33 201941047042-FORM 3 [08-09-2020(online)].pdf 2020-09-08
34 201941047042-RELEVANT DOCUMENTS [19-10-2021(online)].pdf 2021-10-19
35 201941047042-FORM-24 [19-10-2021(online)].pdf 2021-10-19

Search Strategy

1 2020-01-0615-48-41_06-01-2020.pdf