Abstract: TITLE OF INVENTION: SYSTEM AND METHOD FOR ALTERING CUTTING PATH DURING A ROBOTIC SURGERY A method (500) and a computing system (150) for dynamically adjusting a cutting path for a bone during a surgical procedure are disclosed. The computing system (150) retrieves a current cutting path for cutting a bone, the cutting path comprising a plurality of points in a cutting plane, each point associated with coordinates in a two-dimensional space. Intra-operatively, an update request is received to update the current cutting path, the update request comprising a target region to be deleted. A boundary of the target region is determined, comprising a plurality of boundary points. A plurality of inner points of the current cutting path located inside the boundary is identified, along with a plurality of intersection points where the boundary intersects with the current cutting path. An updated cutting path is generated based upon the boundary points, the intersection points, and the inner points, and is sent to a robot (164). Fig. 1b
Description: FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10 and Rule 13)
1. TITLE OF THE INVENTION:
SYSTEM AND METHOD FOR ALTERING CUTTING PATH DURING A ROBOTIC SURGERY
2. APPLICANT:
Name : Merai Newage Private Limited
Nationality : Indian
Address : Survey No. 1574, Bilakhia House, Chala Muktanand Marg, Vapi, Valsad-396191 Gujarat, India.
3. PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed:
FIELD OF INVENTION
The present invention relates to a medical device. More specifically, the present invention relates to a system and a method for altering a cutting path during a robotic surgery.
BACKGROUND OF INVENTION
Robotic surgery, also known as robot-assisted surgery, refers to a minimally invasive surgical technique in which specialized robotic systems are employed to assist surgeons in performing procedures with a high degree of precision and control. Such a system typically includes a robotic arm equipped with surgical instruments, a surgical console operated by the surgeon, and a high-definition vision system that provides a magnified 3D view of the surgical site.
Robot-assisted surgery is widely used for carrying out orthopedic surgery on a patient. Several imaging techniques, such as computerized tomography (CT) scanning, magnetic resonance imaging (MRI), etc., are used before the surgery to plan a cutting path for cutting a bone of the patient, for example, for the placement of an implant. The planned cutting path defines a cutting boundary. Advanced image processing techniques are currently available to build a 3D model of a patient’s anatomy from pre-operative images taken of the patient. Due to such techniques, it is possible to generate a planned cutting path customized according to an individual patient’s anatomy.
Despite these advancements, conventional systems for robot-assisted surgery possess certain limitations. For example, a deviation may exist between the patient’s actual anatomy as registered in a registration process during the surgery and the 3D bone model prepared using pre-operative images (e.g., due to deformities, osteophytes, etc.), or there may be a sudden shift in the patient’s anatomy during the surgery. Using the planned (or pre-operative) cutting path in such situations increases the risk of cutting delicate tissues surrounding the bone and/or not cutting enough surface of the bone. This leads to undesired consequences, such as damage to the patient’s tissues, improper implant fit and so on, thereby not only reducing the effectiveness of the surgical procedure but also compromising the patient’s safety. In conventional systems, whenever the surgeon faces such situations, the surgeon has to frequently pause the cutting operation, manually readjust (or reset) the position/orientation of a cutting tool, and recalibrate the robotic arm, thus increasing the surgery time. For example, while performing a total knee replacement surgery, a situation may arise where the anterior-posterior (AP) dimensions of a patient’s femur match the AP dimensions of standard implant sizes, but the medial-lateral (ML) dimensions of the patient’s femur do not, or vice versa. This results in a misfit of the implant. For example, if the implant is chosen according to the matching AP dimensions, and the ML dimensions of the patient’s femur are wider than the implant, it would leave a section of the bone untrimmed. In such a case, the surgeon performs manual adjustments or trimming of the bone, causing cuts that may not be aligned to the anatomy of the patient. This may introduce variability and is more prone to human error, impacting the final fit and stability of the implant. It also makes the overall process more dependent on the surgeon’s skills. On the other hand, if the ML dimensions of the patient’s femur are smaller than the implant’s width, the bone may be trimmed more than what is required, causing the implant to overhang, leading to soft tissue irritation and pain. Thus, in either case, the mismatch between the cutting path and variations in anatomy observed during the surgery results in the misfit of the implant, causing inconvenience to the patient and negatively affecting the surgical outcome. Conventional systems provide no efficient way to respond to unforeseen variations in the patient’s anatomy.
Hence, there is a need for a system which can overcome the above limitations.
SUMMARY OF INVENTION
The present invention relates to a method and a computing system for dynamically adjusting a cutting path for a bone during a surgical procedure. The computing system retrieves a current cutting path for cutting a bone, the cutting path comprising a plurality of points in a cutting plane, each point associated with coordinates in a two-dimensional space. Intra-operatively, an update request is received to update the current cutting path, the update request comprising a target region to be deleted from the current cutting path. The computing system determines a boundary of the target region comprising a plurality of boundary points, each associated with coordinates in the two-dimensional space. A plurality of inner points of the current cutting path is identified based upon the boundary, the inner points corresponding to a subset of the plurality of points located inside the boundary. A plurality of intersection points is also identified, corresponding to a subset of the boundary points intersecting with the current cutting path. An updated cutting path is generated based upon the boundary points, the intersection points, and the inner points, and is sent to a surgical robot configured to cut the bone according to the updated cutting path.
The foregoing features and other features as well as the advantages of the invention will become more apparent from the following detailed description, which proceeds with reference to the accompanying figures.
BRIEF DESCRIPTION OF DRAWINGS
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosure is not limited to the specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale.
Fig. 1a depicts an exemplary implementation of a robotic surgical system 100, according to an embodiment of the present disclosure.
Fig. 1b depicts a block diagram of the robotic surgical system 100, according to an embodiment of the present disclosure.
Fig. 1c depicts an exemplary user interface 110 including a current cutting path 112 overlaid on a 3D bone model 113, according to an embodiment of the present disclosure.
Figs. 2a-2e depict a visual representation of steps involved in generating an updated cutting path 206 by deleting a target region 204 from a current cutting path 200, according to an embodiment of the present disclosure.
Figs. 2f-2g depict a visual representation of an updated cutting path 212 generated by deleting a target region 209 from a current cutting path 208, according to another embodiment of the present disclosure.
Fig. 3a depicts a visual representation of an updated cutting path 314 generated by scaling a current cutting path 312, according to an embodiment of the present disclosure.
Fig. 3b depicts a visual representation of an updated cutting path 320 generated by scaling a current cutting path 318, according to another embodiment of the present disclosure.
Figs. 4a-4b depict an exemplary user interface 400 to facilitate dynamically altering the current cutting path, according to an embodiment of the present disclosure.
Fig. 5 depicts a flowchart of a method 500 for altering a current cutting path, according to an embodiment of the present disclosure.
Fig. 6 depicts a flowchart of a method 600 for altering a current cutting path, according to another embodiment of the present disclosure.
Fig. 7a depicts an example scenario in a conventional system.
Fig. 7b depicts an example scenario using the robotic surgical system 100, according to an embodiment of the present disclosure.
Fig. 7c depicts an example scenario in a conventional system.
Fig. 7d depicts an example scenario using the robotic surgical system 100, according to an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE ACCOMPANYING DRAWINGS
Prior to describing the invention in detail, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms "include" and "comprise", as well as derivatives thereof, mean inclusion without limitation; the term "or" is inclusive, meaning and/or; the phrases "coupled with" and "associated therewith", as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have a property of, or the like. Definitions of certain words and phrases are provided throughout this patent document, and those of ordinary skill in the art will understand that such definitions apply in many, if not most, instances to prior as well as future uses of such defined words and phrases.
Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment, but mean “one or more but not all embodiments” unless expressly specified otherwise. The terms “including,” “comprising,” “having,” and variations thereof mean “including but not limited to” unless expressly specified otherwise. An enumerated listing of items does not imply that any or all of the items are mutually exclusive and/or mutually inclusive, unless expressly specified otherwise. The terms “a,” “an,” and “the” also refer to “one or more” unless expressly specified otherwise.
Although the operations of exemplary embodiments of the disclosed method may be described in a particular, sequential order for convenient presentation, it should be understood that the disclosed embodiments can encompass an order of operations other than the particular, sequential order disclosed. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Further, descriptions and disclosures provided in association with one particular embodiment are not limited to that embodiment, and may be applied to any embodiment disclosed herein. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed system, method, and apparatus can be used in combination with other systems, methods, and apparatuses.
The embodiments are described below with reference to block diagrams and/or data flow illustrations of methods, apparatus, systems, and computer program products. It should be understood that each block of the block diagrams and/or data flow illustrations, respectively, may be implemented in part by computer program instructions, e.g., as logical steps or operations executing on a processor in a computing system. These computer program instructions may be loaded onto a computer, such as a special purpose computer or other programmable data processing apparatus to produce a specifically-configured machine, such that the instructions which execute on the computer or other programmable data processing apparatus implement the functions specified in the data flow illustrations or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including computer-readable instructions for implementing the functionality specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the data flow illustrations or blocks.
Accordingly, blocks of the block diagrams and data flow illustrations support various combinations for performing the specified functions, combinations of operations for performing the specified functions and program instructions for performing the specified functions. It should also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or operations, or combinations of special purpose hardware and computer instructions. Further, applications, software programs or computer readable instructions may be referred to as components or modules. Applications may be hardwired or hardcoded in hardware or take the form of software executing on a general-purpose computer such that when the software is loaded into and/or executed by the computer, the computer becomes an apparatus for practicing the disclosure, or they are available via a web service. Applications may also be downloaded in whole or in part through the use of a software development kit or a toolkit that enables the creation and implementation of the present disclosure. In this specification, these implementations, or any other form that the disclosure may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the disclosure.
Furthermore, the described features, advantages, and characteristics of the embodiments may be combined in any suitable manner. One skilled in the relevant art will recognize that the embodiments may be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments. These features and advantages of the embodiments will become more fully apparent from the following description and appended claims, or may be learned by the practice of embodiments as set forth hereinafter.
Various methods described herein may be practiced by combining one or more machine-readable storage media containing the code according to the present invention with appropriate standard computer hardware to execute the code contained therein. An apparatus for practicing various embodiments of the present invention may involve one or more computers (or one or more processors within a single computer) and storage systems containing or having network access to computer program(s) coded in accordance with various methods described herein, and the method steps of the invention could be accomplished by modules, routines, subroutines, or subparts of a computer program.
In accordance with the present disclosure, a system and a method for dynamically modifying a current cutting path intra-operatively in an orthopedic surgery are disclosed. In an embodiment, the system includes a computing system configured to generate an updated cutting path from a current cutting path based upon an update request received from a user. Further, the computing system is configured to control a robot based on the updated cutting path; for example, the computing system sends the updated cutting path to the robot, which in turn controls a cutting tool coupled to the robot according to the updated cutting path.
The proposed system presents several advantages over conventionally available systems. The proposed system enables the user to dynamically adjust the cutting path intra-operatively according to the patient’s anatomy. Thus, the system gives the surgeon the flexibility to respond to unforeseen anatomical variations and the dynamic behavior of soft tissues and/or bones during the surgical procedure. It minimizes the risk of damaging critical anatomical structures (e.g., tissues) surrounding the bone. This ensures the safety of the patient. The system provides a user-friendly interface to modify the current cutting path, eliminating the need for frequent, manual adjustment of the cutting tool’s position and robot calibration, thereby reducing human errors and decreasing overall surgery time. Further, in an embodiment, the system is capable of verifying the modified cutting path against the three-dimensional bone profile of the patient, to ensure that the modified cutting path lies within the boundary of the bone profile. This helps the user detect any errors in the updated cutting path designed by the user, thus reducing the chances of incorrect cuts on the bone and/or the surrounding tissue, further enhancing the precision and efficacy of the surgical procedure. The system also allows the user to view the cutting path traced by the robot in real time, and facilitates the surgeon to pause the cutting operation at any time and update the current cutting path as desired. This further enhances the usability of the system. In addition, by dynamically updating the cutting path itself, the system enables the robot to remove the excess portion of the bone during the robotic cutting, ensuring a smooth bone profile for the implant to sit on, as opposed to the manual trimming seen in conventional systems, which may lead to sharp edges, increasing the chances of internal injury or irritation for the patient. Removing the excess bone region also provides space to the surgeon for shifting the implant as needed, thereby improving the placement of the implant. Similarly, excluding unnecessary bone regions from cutting reduces trauma to the patient. Overall, the system and method enhance the surgical accuracy, make the process more efficient, and ensure more accurate placement of an implant, which leads to better clinical outcomes.
Referring to the figures, Fig. 1a depicts an exemplary implementation of a robotic surgical system 100 (hereinafter referred to as a system 100) used in an orthopedic surgery. Fig. 1b depicts a block diagram of the system 100. In an embodiment, the system 100 includes a computing system 150, a tracking system 162 and a robot 164.
The robot 164 is configured to perform one or more steps in a surgery. In an embodiment, the robot 164 is configured to perform one or more steps in an orthopedic surgery. Examples of the orthopedic surgery include, without limitation, knee replacement surgery (e.g., total, partial, uni-compartmental, etc.), hip replacement surgery, shoulder replacement surgery, reverse shoulder replacement surgery, spinal surgery, etc. In an embodiment, the robot 164 includes a base 164a, and at least one robotic arm 164b (hereinafter, robotic arm 164b) coupled to the base 164a. The base 164a acts as a support for the robot 164. The base 164a may have any desired shape (e.g., generally cuboidal). Optionally, the base 164a is provided with a plurality of wheels (not shown) at its bottom so that the robot 164 is movable. This helps in adjusting a position of the robot 164 as desired. The height of the base 164a may be adjustable, which helps in adjusting the robot 164 to an optimal vertical height for the surgery. A proximal end of the robotic arm 164b is fixedly or removably mounted on the base 164a. The robotic arm 164b includes a plurality of pivotable links or joints designed to facilitate movement of the robotic arm 164b in desired degrees of freedom. In an example implementation, the robotic arm 164b is designed to have six degrees of freedom. A surgical tool 164d is coupled to the robotic arm 164b at a distal end 164e of the robotic arm 164b. The surgical tool 164d may be fixedly or removably coupled to the robotic arm 164b. Examples of the surgical tool 164d include, without limitation, a milling tool, a drilling tool, a cutting guide, etc. In an embodiment, the surgical tool 164d includes a cutting tool 164c (also referred to as an end effector 164c). The cutting tool 164c is configured to cut at least a portion of a target bone (e.g., a femur 166a and/or a tibia 166b) during a surgical procedure such as a knee replacement surgery. The cutting tool 164c may include, without limitation, a reamer, a saw blade, a burr, a drill, an ultrasound cutting tool, a laser cutter tool, an electrode, etc. In an embodiment, the robot 164 performs one or more steps in the surgery based upon instructions received from the computing system 150. The computing system 150 controls the operation of the robot 164 and the surgical tool coupled to the robot 164 based, at least, upon surgical guidance information generated at a pre-planning stage of the surgery. The base 164a may house an actuation assembly and/or control circuitry to control movement of the robotic arm 164b (e.g., position and orientation) and operation of the surgical tool in a precise manner. The robot 164 is communicatively coupled to the computing system 150, for example, via a wired interface (e.g., HDMI, LAN, Ethernet, etc.), a wireless interface (e.g., Bluetooth, Wi-Fi, etc.), or combinations thereof. As described herein, communicatively coupled means capable of exchanging data and/or signals.
The tracking system 162 facilitates real-time tracking of the progress of the surgery. For example, the tracking system 162 helps in tracking one or more of: the position and/or orientation of the robot 164, the target bone, and the surgical tool, the real-time movement of the surgical tool 164d (e.g., the cutting tool 164c), and the like. The tracking system 162 also plays a role in registering the patient’s anatomy, e.g., the contour of the target bone(s), with a corresponding 3D model of the target bone(s) by tracking the position and orientation of a registration probe to register a plurality of anatomical points or landmarks. This facilitates accurate alignment of the patient’s anatomy with the 3D model. The 3D model of the patient’s anatomy may be created during the pre-planning phase or intra-operatively using any technique known in the art. In an embodiment, the tracking system 162 is an optical tracking system and accordingly includes one or more optical markers (hereinafter, markers) and one or more cameras 162b. It should be understood that any other tracking system may be provided without deviating from the scope of the present disclosure. The one or more markers of the tracking system 162 are mounted on the target bone of the patient, the robot 164 and/or the surgical tool 164d. The markers may be provided on the base 164a, the arm 164b, and/or the end effector 164c. By way of example, markers 162a coupled to a femur 166a and a tibia 166b are illustrated in Fig. 1a. In an example implementation, the tracking system includes two cameras 162b spaced apart from each other. The cameras 162b are mounted over a casing 150a with the help of a support rod 162c. The cameras 162b are configured to track the position of the one or more markers, such as the markers 162a. In an embodiment, the markers include one or more reflecting elements. In an embodiment, the tracking system 162 includes a light source to illuminate a surgical area using light (visible or infrared), and the light reflected by the markers is captured by the cameras 162b. The tracking system 162 determines the positions of the markers based upon the reflected light and determines the spatial coordinates of the robot 164, the cutting tool 164c, the target bone, etc., based upon the positions of the respective markers. The tracking system 162 sends the real-time spatial coordinates of the robot 164, the cutting tool 164c, the target bone, etc., to the computing system 150, which then uses this information for real-time navigation and/or guidance of the surgery. The tracking system 162 is communicatively coupled to the computing system 150, for example, via a wired interface (e.g., LAN), a wireless interface (e.g., Bluetooth, Wi-Fi) or combinations thereof. In an example implementation, the robot 164 and the tracking system 162 are communicatively coupled to the computing system 150 via a respective signaling cable. For example, the signaling cable may include an Ethernet (LAN, TCP/IP) cable, a USB 3.0/USB-C cable, a Controller Area Network (CAN) bus cable, or a fiber-optic Ethernet cable.
The computing system 150 navigates through various stages of the surgery and provides relevant guidance information to the surgeon to enable the surgeon to perform the surgery more precisely. The guidance information includes, for example, the surgical plan (e.g., entry points, resection angles, target areas, planned locations of implants, etc.), an anatomical model of the patient’s anatomy, intraoperative images (which may be overlaid on the anatomical model), real-time tracking information of the surgical tools relative to the patient’s anatomy, instructions for the surgeon, feedback, etc., related to each stage of the surgery. The computing system 150 also controls the robot 164 to perform various tasks related to the surgery. For example, the computing system 150 sends a cutting path to the robot 164 and the cutting tool 164c cuts the target bone based upon the cutting path. According to an embodiment, the computing system 150 includes a display 152a, at least one input device 154a, a memory 158, and a processing unit 160. The processing unit 160 includes one or more processors 160a coupled to the memory 158. The memory 158 stores a set of instructions that, when executed by the one or more processors 160a, cause the processing unit 160 to perform various functions of the processing unit 160 as described herein. The one or more processors 160a may include a microprocessor, a central processing unit (CPU), a personal computer, a graphical processing unit (GPU), an application-specific integrated circuit (ASIC), or any other computing device capable of executing machine-readable instructions. The memory 158 includes a volatile and/or non-volatile memory. For example, the memory 158 may include a hard disk drive, a solid-state drive, a flash drive, a database, or any other storage system capable of storing machine-readable instructions.
The display 152a is configured to display various information and/or user interfaces as explained herein, under the control of the processing unit 160. The display 152a may be of any known type, such as, LED, LCD, or the like. The display 152a, optionally, has a touch-sensitive capability. The display 152a is communicatively coupled to the processing unit 160. In an embodiment, the computing system 150 includes a display interface unit 152 for interfacing the processing unit 160 with the display 152a via, for example, a wired connection (e.g., HDMI, Ethernet, or the like) or a wireless connection (e.g., Bluetooth, Wi-Fi, or the like). The display interface unit 152 includes associated hardware and/or software, e.g., a graphics card, a graphical processing unit (GPU), a display driver, Bluetooth interface circuitry, HDMI interface circuitry, etc. The at least one input device 154a is configured to receive inputs from the surgeon and send the inputs to the processing unit 160. The at least one input device 154a includes one or more of: a keyboard, a mouse, a stylus, a touch-screen, a touch pad, or the like. The at least one input device 154a is communicatively coupled to the processing unit 160. In an embodiment, the computing system 150 includes an input interface unit 154 for interfacing the processing unit 160 with the at least one input device 154a, via, for example, a wired connection (e.g., USB or the like) or a wireless connection (e.g., Bluetooth or the like). The input interface unit 154 includes associated hardware/software, e.g., an I/O controller, a USB driver, Bluetooth interface circuitry, etc.
In an embodiment, at least some components of the computing system 150 are encased in a casing 150a. For example, the processing unit 160, the memory 158, the display interface unit 152, the input interface unit 154, and the communication interface unit 156 are provided in the casing 150a. In an embodiment, the display 152a and the at least one input device 154a are mounted over the casing 150a, for example, on a top surface of the casing 150a as shown in Fig. 1a. Other implementation scenarios are also contemplated herein. For example, the computing system 150 may be provided with the robot 164. In another example, a part of the computing system 150, say, the processing unit 160 and the memory 158 are provided with the robot 164 (e.g., disposed within a space enclosed by the base 164a) while the display 152a and the input device 154a are provided on the top surface of the casing 150a. In an embodiment, the display 152a may be located remotely, e.g., provided on a wall of the operating room. Optionally, the casing 150a is provided with wheels at the bottom so that the position of the casing 150a can be adjusted as desired.
An orthopedic surgery involves cutting at least a portion (hereinafter interchangeably referred to as a target or desired area) of a target bone. In an embodiment, the robot 164 is configured to control the cutting tool 164c to cut the target area based upon the cutting path received from the computing system 150. The surgeon may desire to change an existing (or current) cutting path intra-operatively under various situations. For example, the surgeon may observe a deviation in the patient’s anatomy compared to the 3D model obtained from pre-operative images (e.g., CT or MRI) and determine that following a pre-planned cutting path is not optimal for the patient. For example, in a knee replacement procedure, the surgeon may find that a cutting boundary defined by the pre-planned cutting path extends outside the surface area of the patient’s femur and may damage surrounding tissues (e.g., ligaments, soft tissues, nerves, etc.). Similarly, in a spine surgery, the surgeon may determine that following the current cutting path may increase a risk of damaging the spinal cord or nerve roots. In such cases, altering the pre-planned cutting path becomes necessary to prevent damage to the surrounding tissues. In another example, the surgeon may find that the pre-planned cutting path is unable to fully cover the target area of the target bone. In such a case too, altering the pre-planned cutting path becomes crucial; otherwise, it may result in incomplete cutting of the bone surface. The surgeon may also want to change the current cutting path due to unexpected shifts in the patient’s anatomy during the surgery, or errors in the pre-planned cutting path. Further, the surgeon may observe that the pre-planned cutting path may result in overcutting (which leaves insufficient bone for supporting the implant) or in undercutting (which leads to insufficient resection of bone for fitting the implant properly). It is also possible that the current cutting path is designed based upon a pre-planned implant size (say, size 4), but intraoperatively, the surgeon determines that an implant of a different size (say, size 5) fits better according to the patient’s anatomy. In such cases, the surgeon determines that the pre-planned cutting path is not optimal with respect to the implant fitting, and the surgeon may wish to alter the current cutting path to ensure that the resected bone surface matches the geometry and dimensions of the implant to ensure proper fit and alignment. Unlike a conventional system, the system 100 enables the surgeon to modify a current cutting path intra-operatively. In an embodiment, the computing system 150 is configured to dynamically alter (or modify) the current cutting path during a surgery (i.e., intra-operatively). Various embodiments related to the dynamic changing of the cutting path are explained below.
The processing unit 160 is configured to retrieve a current cutting path. The current cutting path defines a cutting plane and includes a plurality of points in the cutting plane. Each point is associated with (or, in other words, is defined by) coordinates in a two-dimensional space. In an embodiment, the current cutting path includes a pre-planned cutting path defined in a planning phase of the surgery. That is, the pre-planned cutting path corresponds to a cutting path defined at the pre-operative stage. In one embodiment, the pre-planned cutting path is pre-stored in the memory 158 and the processing unit 160 retrieves the pre-planned cutting path from the memory 158. In another embodiment, the processing unit 160 retrieves the pre-planned cutting path from a remote storage device, e.g., a USB storage device, a remote database, a server or a cloud storage system. In an embodiment, the computing system 150 includes a communication interface unit 156 to interface with the remote storage device. Accordingly, the communication interface unit 156 includes associated hardware and/or software for communicating with the remote storage device. For example, the communication interface unit 156 may include a Bluetooth module or a Wi-Fi module to establish a connection with the remote storage device via the respective wireless network. In another example, the communication interface unit 156 may include Ethernet interface circuitry to establish a wired connection with the remote storage device via the internet. In yet another example, the communication interface unit 156 includes USB interface circuitry to interface with a USB storage device (storing the pre-planned cutting path) which is directly plugged into a USB port of the computing system 150. In an embodiment, the processing unit 160 obtains the pre-planned cutting path from the user with the help of a suitable user interface displayed on the display 152a. For example, such a user interface may include an upload button to upload a file including the pre-planned cutting path. In an embodiment, the current cutting path includes a cutting path generated intra-operatively. In this case, the processing unit 160 retrieves the current cutting path from the memory 158.
The processing unit 160 is configured to display the current cutting path (e.g., the pre-planned cutting path) on the display 152a. Optionally, the processing unit 160 overlays the 3D model of the target area of the target bone on the current cutting path. For example, the processing unit 160 generates and displays a user interface, such as, a user interface 110 illustrated in Fig. 1c on the display 152a. In an embodiment, the processing unit 160 displays an image 111 including the current cutting path 112 and a 3D bone model 113 overlaid with each other. Overlaying the 3D model 113 of the patient’s anatomy with the current cutting path 112 helps the surgeon (interchangeably referred to as a user) to determine whether the current cutting path 112 needs to be updated. In an embodiment, the user interface 110 includes a first user element 114a enabling the surgeon to indicate that the surgeon wants to update the current cutting path 112. In an embodiment, the user interface 110 also includes a second user element 114b enabling the surgeon to indicate that the surgeon does not want to update the current cutting path 112. The first user element 114a and the second user element 114b may be buttons, radio buttons, checkboxes, or the like. In an example implementation, the first user element 114a and the second user element 114b are buttons as depicted in Fig. 1c. The processing unit 160 is configured to receive an indication for updating the current cutting path (hereinafter, a first indication), e.g., when the surgeon clicks on the first user element 114a. Upon receiving the first indication, the processing unit 160 is configured to initiate a process for updating the current cutting path 112. Upon receiving an indication (hereinafter, a second indication) that the surgeon does not want to update the current cutting path 112, e.g., when the surgeon clicks on the second user element 114b, the processing unit 160 sends the current cutting path 112 to the robot 164, instructing the robot 164 to cut the target area of the target bone according to the current cutting path 112. The processing unit 160 may also allow the surgeon to set cutting parameters such as, cutting speed, cutting force, depth of cut, etc. at this stage.
In an embodiment, in response to receiving the first indication, the processing unit 160 is configured to generate and display a user interface, such as, a user interface 400 as illustrated in Fig. 4a. The user interface 400 includes a first image 402 showing an exemplary current cutting path 402a. The user interface 400 enables the surgeon to update the current cutting path 402a intraoperatively. The processing unit 160 facilitates the surgeon to update the current cutting path 402a in different ways, such as, scaling the current cutting path 402a, deleting a portion of the current cutting path 402a, etc., or combinations thereof. The processing unit 160 is configured to receive an update request to update the current cutting path 402a, e.g., via the user interface 400. The update request includes an input related to updating the current cutting path 402a. It should be understood that the current cutting path 402a illustrated herein is merely exemplary, and it may take any other form or shape depending upon the surgical requirements.
In an embodiment, the update request includes a region (or a target region) to be deleted from the current cutting path. The target region has a pre-defined shape and/or a freehand shape. The processing unit 160 enables the surgeon to define the target region with the help of a user interface, such as the user interface 400. For example, the user interface 400 includes a first section 406 (interchangeably referred to as a first update section 406) including one or more user elements to allow the surgeon to define the target region. In an embodiment, the first update section 406 includes a first window 406a having a list including a plurality of predefined shapes. Examples of the predefined shapes include, without limitation, square, rectangle, circle, ellipse, triangle, trapezium, polygon, or the like. In an embodiment, the first window 406a includes a plurality of buttons, each button corresponding to a respective predefined shape, for example, a button 406d, a button 406e, a button 406f and a button 406g corresponding to the shapes square, rectangle, circle, and ellipse, respectively. The user may select a desired predefined shape from the list by clicking on the respective button and then place the selected predefined shape over the current cutting path 402a (displayed in the first image 402) at a desired location to define the target region. The surgeon may determine the location of the target region based upon, for example, the patient’s anatomy. The target region corresponds to a region of the current cutting path 402a inside the predefined shape. The predefined shape at least partially overlaps the current cutting path 402a. It should be understood that the number of predefined shapes (four, in this case) displayed to the user is merely exemplary. Fewer or more than four predefined shapes may be displayed without deviating from the scope of the present disclosure. Further, the specific predefined shapes (in this example, square, rectangle, circle, ellipse) are given for illustration purposes only; other predefined shapes may be displayed instead of, or in addition to, these predefined shapes. In an embodiment, the number of predefined shapes and the specific predefined shapes to be displayed in the first window 406a are user configurable. Further, it should be understood that the selection of the desired predefined shape with the help of a button as described herein is merely exemplary. The first window 406a may include other user elements such as, check-boxes, radio buttons, a drop-down list, or the like, to enable the user to select the desired predefined shape.
In an embodiment, the first update section 406 includes a second window 406b (shown in Fig. 4), instead of, or in addition to, the first window 406a. The second window 406b allows the user to draw a freehand shape to define the target region. In an embodiment, the second window 406b includes a user interface element, for example, a button 406c, to actuate a freehand drawing tool. When the user clicks on the button 406c, the processing unit 160 actuates the freehand drawing tool, facilitating the user to draw any desired freehand shape over the current cutting path 402a at a desired location determined, for example, based upon the patient’s anatomy. In an embodiment, the user draws the freehand shape over the current cutting path 402a using the input device 154a. In another embodiment, the user draws the freehand shape directly on the display 152a (for example, when the display 152a is touch-enabled). By way of example, Fig. 2f illustrates the current cutting path 208 and the freehand shape 210 drawn by the user. The portion of the current cutting path 208 located inside the freehand shape 210 corresponds to the target region in this example.
Upon receiving the update request having the target region, the processing unit 160 updates the current cutting path. Various embodiments related to this are explained below with reference to Figs. 2a - 2g. Fig. 2a illustrates an exemplary current cutting path 200. The current cutting path 200 defines a cutting plane 200a (without loss of generality, an X-Y plane) having an origin 200b. The current cutting path 200 includes a plurality of points P1, P2, P3, P4 and so on, having respective two-dimensional (e.g., X-Y) coordinates. Arrows 201 denote a direction (also referred to as a forward direction) of the cutting path 200. The forward direction represents a direction in which cutting is to be done, i.e., a direction in which the cutting tool moves forward to cut the bone. Consider, for example, that the user placed an ellipse 202 over the current cutting path 200 as shown in Fig. 2a to define a target region 204. The target region 204 is the area inside the ellipse 202. In an embodiment, the processing unit 160 is configured to determine a boundary 204a of the target region 204. The boundary 204a includes a plurality of boundary points. Each boundary point is associated with (i.e., is defined by) coordinates in the two-dimensional space corresponding to the cutting plane 200a. The processing unit 160 is configured to compute the coordinates of the boundary points with respect to the origin 200b of the cutting plane 200a, thereby determining the boundary 204a of the target region 204. The processing unit 160 determines the boundary 204a of the target region 204 using a suitable technique such as, without limitation, grid-based rasterization, spatial indexing (such as, quadtrees, k-d trees, etc.), and the like. In an embodiment, the boundary 204a of the target region 204 is determined using the boundary representation (B-rep) technique. Fig. 2b visually depicts exemplary boundary points B1 - B6 determined by the processing unit 160.
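By way of a non-limiting illustration, the following Python sketch shows one exemplary way of computing boundary points of an elliptical target region, such as the ellipse 202, in the coordinates of the cutting plane 200a. The function name, the parametric sampling approach, and the numeric values are illustrative assumptions only and are not mandated by the techniques (e.g., B-rep, grid-based rasterization) described above.

    import math

    def ellipse_boundary_points(cx, cy, rx, ry, n=64):
        """Sample n boundary points of an axis-aligned ellipse centered at
        (cx, cy) with semi-axes rx and ry, expressed as coordinates with
        respect to the origin of the cutting plane."""
        return [(cx + rx * math.cos(2.0 * math.pi * k / n),
                 cy + ry * math.sin(2.0 * math.pi * k / n))
                for k in range(n)]

    # Example (hypothetical values): boundary points of an ellipse placed
    # over the current cutting path, cf. points B1 - B6 of Fig. 2b.
    boundary = ellipse_boundary_points(cx=10.0, cy=5.0, rx=4.0, ry=2.0, n=8)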
The processing unit 160 is configured to identify a subset of the plurality of points (hereinafter, a plurality of inner points or inner points) on the current cutting path 200 that are located inside the boundary 204a of the target region 204. In other words, the inner points correspond to all those points on the current cutting path 200 that are located inside the boundary 204a of the target region 204. The processing unit 160 identifies the inner points using a technique such as, without limitation, the ray casting algorithm, the Boolean difference method, grid-based rasterization, the Weiler-Atherton algorithm, etc. In an embodiment, the processing unit 160 uses the Boolean difference technique to identify the inner points. Fig. 2c visually depicts exemplary inner points C1 - C8 determined by the processing unit 160.
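As a minimal sketch of one such technique, the following Python code identifies the inner points using the ray casting algorithm mentioned above, under the assumption that the boundary of the target region is available as a closed polygon of boundary points; the function names are hypothetical, and other techniques (e.g., the Boolean difference method) may equally be used.

    def point_inside(pt, boundary):
        """Ray casting test: cast a horizontal ray from pt and count how
        many edges of the closed boundary polygon it crosses; an odd
        count means pt lies inside the boundary."""
        x, y = pt
        inside = False
        n = len(boundary)
        for i in range(n):
            x1, y1 = boundary[i]
            x2, y2 = boundary[(i + 1) % n]
            if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
                x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                if x_cross > x:
                    inside = not inside
        return inside

    def find_inner_points(path, boundary):
        """Inner points: the subset of cutting-path points located inside
        the boundary of the target region, cf. points C1 - C8 of Fig. 2c."""
        return [p for p in path if point_inside(p, boundary)]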
The processing unit 160 is configured to identify a plurality of intersection points based upon the boundary points and the plurality of points in the current cutting path 200. The intersection points correspond to a subset of the boundary points that intersect with the current cutting path 200. In other words, the intersection points are those boundary points that also lie on the current cutting path 200. This involves identifying the points where the boundary 204a of the target region 204 intersects with the current cutting path 200. The processing unit 160 uses a technique such as, without limitation, ray tracing, boundary representation (B-rep), geometric analysis, etc., to identify the intersection points. In an embodiment, the processing unit 160 uses the boundary representation (B-rep) technique to identify the intersection points.
According to an embodiment, the processing unit 160 determines an equation of each line segment of the current cutting path 200 intersecting with the target region 204 and an equation of the predefined shape (the ellipse 202 in this example) that defines the target region 204. The processing unit 160 is configured to determine one or more boundary points that satisfy both the equation of the corresponding line segment and the equation of the predefined shape, and classify them as the intersection points. Fig. 2d visually depicts exemplary intersection points D1 - D6 determined by the processing unit 160.
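The following Python sketch illustrates this equation-solving step for an axis-aligned ellipse such as the ellipse 202: the parametric equation of a line segment of the cutting path is substituted into the equation of the ellipse, and the resulting quadratic is solved for the segment parameter t. The function and variable names are assumptions made for illustration only.

    import math

    def segment_ellipse_intersections(a, b, cx, cy, rx, ry):
        """Intersection points of the segment a-b with the ellipse
        ((x - cx) / rx)^2 + ((y - cy) / ry)^2 = 1, found by solving a
        quadratic in the segment parameter t, where 0 <= t <= 1."""
        (ax, ay), (bx, by) = a, b
        dx, dy = (bx - ax) / rx, (by - ay) / ry  # segment direction, scaled
        fx, fy = (ax - cx) / rx, (ay - cy) / ry  # segment start, scaled
        qa = dx * dx + dy * dy
        qb = 2.0 * (fx * dx + fy * dy)
        qc = fx * fx + fy * fy - 1.0
        disc = qb * qb - 4.0 * qa * qc
        if qa == 0.0 or disc < 0.0:
            return []  # degenerate segment or no intersection
        points = []
        for t in ((-qb - math.sqrt(disc)) / (2.0 * qa),
                  (-qb + math.sqrt(disc)) / (2.0 * qa)):
            if 0.0 <= t <= 1.0:  # keep only points lying on the segment
                points.append((ax + t * (bx - ax), ay + t * (by - ay)))
        return points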
The processing unit 160 is configured to generate an updated cutting path based upon the boundary points, the intersection points and the inner points. In an embodiment, the processing unit 160 is configured to determine a plurality of pairs of intersection points. The plurality of intersection points may include at least one pair of intersection points. Each pair of intersection points includes a first intersection point and a second intersection point adjacently disposed according to the forward direction of the current cutting path 200. For example, referring to Fig. 2d, the intersection points D1-D2 form one pair of intersection points, where D1 is the first intersection point and D2 is the second intersection point of this pair. Similarly, the intersection points D3-D4 form another such pair of intersection points, where D3 is the first intersection point and D4 is the second intersection point, and so forth. For each pair of intersection points, the processing unit 160 is configured to identify those inner points of the plurality of inner points lying on the current cutting path 200 and extending from the first intersection point to the second intersection point (hereinafter referred to as a plurality of first inner points or first inner points), and those boundary points of the plurality of boundary points lying on the boundary 204a of the target region 204 and extending from the first intersection point to the second intersection point (hereinafter referred to as a plurality of first boundary points or first boundary points). The processing unit 160 is configured to replace the first inner points with the first boundary points. This is explained by way of an example with reference to Fig. 2d1. For the pair of intersection points D1-D2, the processing unit 160 identifies the first inner points, such as points C1 - C3, between the intersection points D1-D2 in the forward direction, and identifies the first boundary points, such as points B1 - B3, between the intersection points D1-D2 in the forward direction. The processing unit 160 replaces the first inner points C1 - C3 with the first boundary points B1 - B3 to form a section of an updated cutting path from the point D1 to the point D2. The processing unit 160 repeats this process for all pairs of intersection points, e.g., for the pair D3-D4, the pair D5-D6, and so forth. Performing such a replacement for all pairs of intersection points results in an updated cutting path, such as an updated cutting path 206 illustrated in Fig. 2e.
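A simplified Python sketch of this replacement step is given below. It assumes, purely for illustration, that the intersection points have already been inserted into the path in the forward direction, that the indices of the inner points are known (e.g., from an inside test such as the one sketched earlier), and that the ordered boundary points between each pair of intersection points have been precomputed; none of these names or data structures are mandated by the present disclosure.

    def splice_path(path, inner_indices, detours):
        """Build the updated cutting path: walk the current path in the
        forward direction, drop the inner points, and after each first
        intersection point insert the boundary points leading to its
        paired second intersection point.

        path          : list of (x, y) points, including intersection points
        inner_indices : set of indices of points inside the target region
        detours       : dict mapping the index of a first intersection point
                        (e.g., D1) to the ordered boundary points (e.g.,
                        B1 - B3) to traverse up to the paired second
                        intersection point (e.g., D2)
        """
        updated = []
        for i, p in enumerate(path):
            if i in inner_indices:
                continue  # first inner points are deleted from the path
            updated.append(p)
            if i in detours:
                updated.extend(detours[i])  # follow the target-region boundary
        return updated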
Figs. 2f and 2g visually represent another example of deleting the target region from a current cutting path. For example, Fig. 2f depicts an exemplary current cutting path 208. The user defines an exemplary target region 209 by placing a freehand shape 210 using, for example, the user interface 400 as described earlier. The processing unit 160 determines the boundary points, the inner points and the intersection points corresponding to the target region 209, and generates an updated cutting path 212 accordingly using a similar method as described earlier. It should be appreciated that though Figs. 2a - 2g illustrate deletion of a target region from the current cutting path at a single location, it is possible that the surgeon may define multiple target regions at desired locations using the user interface 400 (either simultaneously or sequentially), and the processing unit 160 generates a corresponding updated cutting path accordingly.
In an embodiment, the update request includes a scaling factor for scaling the current cutting path. The scaling factor dictates the extent to which the cutting boundary and the overall area of the current cutting path are to be increased or reduced. For example, a scaling factor with a value less than one corresponds to scaling down the cutting boundary and the area of the current cutting path, while a scaling factor with a value more than one corresponds to scaling up the cutting boundary and the area of the current cutting path. Preferably, the scaling factor is between 0.7 and 1.3 for a knee replacement surgery. In an embodiment, the processing unit 160 receives a separate scaling factor for each of the two coordinates (e.g., the x-coordinate and the y-coordinate) of the plurality of points on the current cutting path. For example, ‘Sx’ corresponds to the scaling factor to be applied to the x-coordinate and ‘Sy’ corresponds to the scaling factor to be applied to the y-coordinate of each point on the current cutting path. In an embodiment, the processing unit 160 receives a single scaling factor for both of the coordinates (e.g., the x-coordinate and the y-coordinate) of the points on the current cutting path. The processing unit 160 enables the surgeon to provide the scaling factor with the help of a user interface, such as the user interface 400. For example, the user interface 400 includes a second section 408 (interchangeably referred to as a second update section 408) including one or more user elements to allow the surgeon to provide the scaling factor. For example, the user interface 400 includes an input field 408a with a placeholder 408b for the user to enter the scaling factor, as depicted in Fig. 4a. In this example, the scaling factor applies to both the x and y coordinates. As explained earlier, it is possible, though, that the input field 408a includes two placeholders 408b to enable the user to enter two separate scaling factors.
Upon receiving the scaling factor, the processing unit 160 is configured to execute a set of instructions to compute updated values for the coordinates of each of the plurality of points on the current cutting path based upon the scaling factor to generate the updated cutting path. In an embodiment, the processing unit 160 computes the updated values of the coordinates of the points by multiplying current or existing values of the coordinates of the points with the respective scaling factor.
Consider, for example, that Pi = (Xi, Yi), for i = 1, 2, …, N, denote the points of the current cutting path, where ‘Pi’ represents the ith point of the current cutting path, Xi and Yi are the x-coordinate and y-coordinate values of the ith point, and N equals the total number of points in the current cutting path. Further, let Xi’ and Yi’ be the updated values of the x and y coordinates, respectively, of the ith point of the current cutting path. Then, the updated coordinate values are computed as follows: Xi' = Sx * Xi and Yi' = Sy * Yi. The computation of the updated values of the coordinates can be represented in matrix format as follows:
[X'; Y'] = [Sx 0; 0 Sy] [X; Y], where [X; Y] represents the vector of the x and y coordinates (Xi, Yi) of a point on the current cutting path, [X'; Y'] represents the vector of the updated x and y coordinate values (Xi', Yi'), and Sx and Sy are the diagonal entries of the 2x2 scaling matrix, representing the scaling factors for the x-coordinates and the y-coordinates, respectively.
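A minimal Python sketch of this scaling computation, directly mirroring the equations above, is given below; the function name and the example values are illustrative assumptions.

    def scale_path(path, sx, sy):
        """Apply the scaling update (Xi', Yi') = (Sx * Xi, Sy * Yi) to
        every point of the current cutting path."""
        return [(sx * x, sy * y) for (x, y) in path]

    # Example (hypothetical path): scaling down by a factor of 0.8 in both
    # coordinates, cf. the current cutting path 312 and the updated cutting
    # path 314 of Fig. 3a.
    current_path = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]
    updated_path = scale_path(current_path, sx=0.8, sy=0.8)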
Two examples related to generating the updated cutting path based upon the scaling factor are illustrated in Figs. 3a - 3b. In an exemplary implementation, Fig. 3a depicts a current cutting path 312 having a cutting plane 300. By way of example, a scaling factor having a value equal to 0.8 is received by the processing unit 160. As explained earlier, the processing unit 160 computes the updated values of the coordinates of each point of the current cutting path 312 by multiplying the existing values of the coordinates by the scaling factor (i.e., 0.8 in this example). Fig. 3a illustrates an exemplary updated cutting path 314. Thus, in this case, the current cutting path 312 is scaled down by a factor of 0.8 to obtain the updated cutting path 314. In another exemplary implementation, Fig. 3b depicts a current cutting path 318 and a corresponding generated updated cutting path 320 on a cutting plane 316. In this example, the scaling factor is 1.2, i.e., the current cutting path 318 is scaled up by a factor of 1.2 to obtain the updated cutting path 320.
In an embodiment, the processing unit 160 displays the current cutting path and the corresponding updated cutting path to the user. For example, the processing unit 160 generates an image including the current cutting path and the updated cutting path overlaid on each other. Optionally, or in addition, the processing unit 160 displays a bone profile (two-dimensional or three-dimensional) of the target bone overlaid on the current cutting path and the updated cutting path. In an embodiment, the processing unit 160 displays a two-dimensional bone profile. Visually displaying the current cutting path and the updated cutting path overlaid on each other enables the user to see the updated cutting path vis-à-vis the current cutting path and determine whether the updated cutting path meets the surgeon’s requirements for a given patient anatomy. Showing the 2D bone profile overlaid on the current and updated cutting paths allows the surgeon to assess how the updated cutting path would impact the patient’s bone and/or surrounding tissues. If the updated cutting path does not meet the surgeon’s requirements, the surgeon can re-generate another updated cutting path by changing one or more parameters (e.g., the scaling factor, the position of the target region, the shape and/or size of the target region, etc.) until the updated cutting path meets the surgeon’s requirements. In an embodiment, the processing unit 160 displays the image on a user interface, such as the user interface 400. For example, the user interface 400 includes a second image 404 showing a current cutting path 402a and an updated cutting path 404a. In the depicted example, the updated cutting path 404a is generated by scaling the current cutting path 402a by a scaling factor of 0.8. The processing unit 160 generates the second image 404. In an embodiment, the processing unit 160 enables the surgeon to select whether to show the 2D bone profile or not, via a user interface element 414 provided on the user interface 400. In an example implementation, the user interface element 414 includes a checkbox, though it is possible that the user element may alternatively be, for example, a button. The processing unit 160 displays the 2D bone profile 414a when the user checks the checkbox, as shown in Fig. 4a.
The processing unit 160 is configured to send the updated cutting path to the robot 164 which in turn cuts the bone according to the updated cutting path. In an embodiment, the processing unit 160 sends a file containing the updated cutting path to the robot 164. The file has a predefined format, e.g., .txt format. Additionally, or optionally, the processing unit 160 is configured to verify the updated cutting path. In an embodiment, the processing unit 160 verifies the updated cutting path prior to sending the updated cutting path to the robot 164. In an embodiment, the processing unit 160 checks the updated cutting path against the three-dimensional bone profile of the patient, to ensure that the updated cutting path lies within the boundary of the bone profile. In an embodiment, the processing unit 160 checks the updated cutting path against the bone profile using a suitable technique such as, without limitation, Iterative Optimization Algorithms, ray tracing, FEA (Finite Element Analysis), etc.
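A minimal sketch of such a boundary check, assuming a two-dimensional bone-profile outline given as a polygon and using a standard point-in-polygon test (rather than the iterative optimization, ray tracing, or FEA techniques named above), might look as follows; the helper name path_within_profile is hypothetical.

```python
import numpy as np
from matplotlib.path import Path

def path_within_profile(updated_path_xy, profile_xy):
    """Return True if every point of the updated cutting path lies inside
    the closed 2D bone-profile polygon (a simplified stand-in for the
    verification step described above)."""
    profile = Path(np.asarray(profile_xy, dtype=float), closed=True)
    inside = profile.contains_points(np.asarray(updated_path_xy, dtype=float))
    return bool(np.all(inside))
```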
Though the deletion of the target region and the scaling are described separately, it is possible that the cutting path is updated using both of these options. In an embodiment, the processing unit 160 is configured to update the cutting path based on both the scaling factor and the target region. For example, the surgeon may choose to scale the current cutting path and then delete the target region from the scaled cutting path, providing the respective inputs in the update request via the user interface 400. The processing unit 160 then generates an updated cutting path in a similar manner as described earlier.
The processing unit 160 tracks, with the help of the tracking system 162, the real-time progress of the cutting operation performed by the robot 164. In an embodiment, the processing unit 160 displays the real-time position of the cutting tool 164c, the bone profile and the updated cutting path on a user interface presented on the display 152a so that the surgeon can visually track the entire cutting operation. The computing system 150 allows the surgeon to pause the cutting operation at any time if the surgeon feels a need to modify the cutting path. For example, the aforesaid user interface may include a ‘pause’ button. When the surgeon clicks on the ‘pause’ button, the processing unit 160 instructs the robot 164 to pause the cutting operation and may display the user interface 400 to the user, allowing the user to modify the current cutting path. Thus, the computing system 150 allows the surgeon to dynamically modify not just the pre-planned cutting path but also the cutting path at any stage of the cutting operation, thereby enhancing the effectiveness and accuracy of the cutting operation and increasing patient safety.
Figs. 4a-4b illustrate an exemplary user interface 400 and an exemplary user interface 450, respectively, according to an embodiment. The user interface 400 is displayed on the display 152a and enables the user to interact with the computing system 150 to modify the current cutting path intraoperatively based upon the patient’s anatomy.
In an embodiment, the user interface 400 includes various components such as a first image 402, a second image 404, a first update section 406, and a second update section 408. The processing unit 160 is configured to generate the user interface 400 and its components (the first image 402, the second image 404, the first update section 406, and the second update section 408), the various user interface elements of the user interface 400, and the various other user interfaces (e.g., the user interfaces 110, 450) and user interface elements described in the present disclosure. The display 152a is configured to display the user interfaces (e.g., the user interfaces 110, 400, 450) along with their components and user interface elements to the user.
In an embodiment, the first image 402 includes a visual representation of the current cutting path 402a. In an embodiment, the current cutting path 402a is the pre-planned cutting path. In another embodiment, the current cutting path 402a is the cutting path with the latest updates. In an embodiment, the current cutting path 402a is displayed along with a corresponding cutting plane defined, for example, by an x-axis and a y-axis.
The first update section 406 provides various options related to deleting a target region of the current cutting path 402a. For example, the first update section 406 includes a first window 406a and a second window 406b enabling the user to define the target region using a predefined shape and a freehand shape, respectively, the details of which have been explained above in the context of Figs. 1a-1b and 2a-2g. The second update section 408 provides various options related to scaling the current cutting path 402a. The details of the second update section 408 have been explained above in the context of Figs. 1a-1b and Figs. 3a-3b. In an embodiment, the second update section 408 includes a section 408c (also referred to as a parameter section 408c). The parameter section 408c displays one or more parameters related to the cutting operation. In an embodiment, the parameter section 408c displays one or more of a cutting speed, a cutting depth, the scaling factor entered by the user, the cutting length of the updated cutting path, the cutting area of the updated cutting path, and so forth. This helps the user keep a check on the overall cutting operation.
Optionally, or in addition, the processing unit 160 is configured to receive a confirmation input. The confirmation input indicates that the user has verified the one or more parameters related to the cutting operation displayed in the section 408c. Accordingly, the user interface 400 includes a user element 408d, such as a button as shown in Fig. 4a. The processing unit 160 receives the confirmation input in response to the actuation of the user element 408d (e.g., clicking on the button) and is configured to finalize the one or more parameters to update the current cutting path 402a.
The user interface 400 includes a second image 404. The second image 404 includes a visual representation of an updated cutting path 404a. The updated cutting path 404a is generated based upon the inputs from the user as explained earlier. In an embodiment, the computing system 150 allows the user to compare the current cutting path 402a and the updated cutting path 404a. For example, the user interface 400 includes a user element 412, for example, a check box (as shown in Fig. 4a), a button or the like. When the user actuates the user element 412 (e.g., by checking the check box), the processing unit 160 displays the current cutting path 402a overlaid on the updated cutting path 404a in the second image 404. In an embodiment, the computing system 150 also allows the user to visualize the bone profile (e.g., a 3D bone profile or a 2D bone profile) of the target bone along with the updated cutting path 404a. Accordingly, the user interface 400 includes, for example, a user element 414, such as a check box (as depicted in Fig. 4a), a button or the like. When the user actuates the user element 414 (e.g., by checking the check box), the processing unit 160 modifies the second image 404 to include the bone profile 414a overlaid on the updated cutting path 404a. In the depicted example, the bone profile 414a is a 2D bone profile. It is possible that the user may choose to actuate both user elements 412 and 414 to view the current cutting path 402a, the updated cutting path 404a and the bone profile 414a simultaneously. In this case, the processing unit 160 modifies the second image 404 to display these overlaid on each other, as depicted in Fig. 4a. This helps the user visualize and assess both the current and updated cutting paths 402a, 404a with respect to the patient’s anatomy (represented by the bone profile 414a), for determining whether the updated cutting path 404a meets the requirements or whether any further modifications are needed.
The computing system 150 also, optionally, allows the user to undo any modifications made to the current cutting path (such as the current cutting path 402a). Accordingly, in an embodiment, the user interface 400 includes a user element 410, such as a button, though other forms of user elements may be used. The processing unit 160 is configured to undo the changes and revert to the current cutting path 402a when the user element 410 is actuated. Further, the user interface 400 optionally includes a previous button 416a and/or a next button 416b used to navigate through the various user interfaces. For example, upon clicking the previous button 416a, the processing unit 160 is configured to navigate to a previous user interface in the surgical guidance, while, upon clicking the next button 416b, the processing unit 160 is configured to navigate to a next user interface in the surgical guidance, for example, to the user interface 450.
The processing unit 160 is configured to generate the user interface 450 and its various components upon receiving an actuation via the next button 416b of the user interface 400. In an embodiment, the user interface 450 includes various components such as a third image 452 and one or more user elements. The third image 452 includes a visual representation of the updated cutting path 404a. The updated cutting path 404a is generated based upon the inputs received from the user via the various user elements of the user interface 400. The user interface 450 includes a user element 454 which, upon actuation, allows the user to visualize the updated cutting path 404a along with a bone profile 454a (for example, a 2D or 3D bone profile) of the target bone. For example, the user element 454 may include a checkbox, a button or the like. This enables the user to visualize the updated cutting path 404a over the bone profile and make adjustments in real time accordingly. Further, the user interface 450 includes a user element 456 (for example, a checkbox, a button, etc.) which, upon actuation, allows the user to visualize the current cutting path 402a. When the user actuates the user element 456 (for example, by checking the checkbox or clicking on the button), the processing unit 160 modifies the third image 452 to include the current cutting path 402a overlaid on the updated cutting path 404a. This enables the user to compare the changes made in the updated cutting path 404a over the current cutting path 402a.
In an embodiment, the computing system 150 enables the user to trigger a verification process. For example, the user interface 450 includes a user element 458, such as a button (as shown in Fig. 4b), though other user elements may be used, to provide this trigger. In response to actuation of the user element 458, the processing unit 160 verifies the updated cutting path 404a in a manner as described earlier. Further, the user interface 450 optionally includes a previous button 460a used to navigate to a previous user interface (for example, the user interface 400) in the surgical guidance. The user interface 450 also includes a proceed button 460b to finalize the updated cutting path 404a. For example, upon clicking the proceed button 460b, the processing unit 160 is configured to send the updated cutting path 404a to the robot 164.
Thus, the user interfaces 400, 450 allow the user to access various options and/or functionalities with respect to updating the current cutting path via a single user interface. It is possible, however, that different options and/or functionalities are made accessible to the user via multiple user interfaces. For example, the first and second update sections 406, 408 and the first and second images 402, 404 may be presented to the user in separate user interfaces. In another example, the first window 406a and the second window 406b may be provided in separate user interfaces. Other variations of the user interfaces are also possible without deviating from the scope of the present disclosure. Further, it should be appreciated that the user interface 110 explained herein may be optional. Accordingly, in one embodiment, once the current cutting path is retrieved, the processing unit 160 directly presents the user interface 400. Further, in an embodiment, the user elements 454, 456, 458 and the proceed button 460b may be provided on the user interface 400, and the processing unit 160 does not display the user interface 450 in this case.
Fig. 5 depicts a flowchart of a method 500 for dynamically altering (or adjusting) a cutting path of a bone during a surgical procedure, according to an embodiment. The method 500 is performed intra-operatively, i.e., during a surgical procedure. At step 502, a current cutting path is retrieved by the computing system 150. The current cutting path includes a plurality of points in a cutting plane. Each point in the current cutting path is associated with coordinates in a two-dimensional space. In other words, each point is defined by the coordinates in the two-dimensional space. In an embodiment, the current cutting path includes the pre-planned cutting path defined in a pre-operative planning stage. Various embodiments for retrieving the current cutting path and the pre-planned cutting path have been described earlier. Fig. 2a represents an exemplary current cutting path 200.
At step 504, the computing system 150 receives an update request to update the current cutting path. In an embodiment, the update request includes a region (or a target region) to be deleted from the current cutting path. In an embodiment, the target region has a predefined shape. Various examples of the predefined shape have been outlined earlier. In another embodiment, the target region has a freehand shape. Various embodiments related to receiving the update request including the target region are explained earlier. Fig. 2b represents an exemplary target region 204 in the update request, wherein the target region 204 is defined by an ellipse 202.
At step 506, the computing system 150 generates an updated cutting path based upon the target region. In an embodiment, the computing system 150 generates the updated cutting path using steps 506a – 506d as explained below. At step 506a, the computing system 150 determines a boundary of the target region. The boundary of the target region includes a plurality of boundary points associated with respective coordinates in the two-dimensional space. The computing system 150 determines the boundary and the coordinates of the boundary points in a similar manner as explained earlier. Fig. 2c represents exemplary boundary points B1 – B8 on the boundary 204a of the target region 204.
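For an elliptical target region such as the one in Fig. 2c, the boundary points may be obtained, for example, by sampling the parametric form of the ellipse. The following minimal Python sketch (the helper name ellipse_boundary_points is an assumption made for illustration) shows one such realization.

```python
import numpy as np

def ellipse_boundary_points(cx, cy, a, b, n=64):
    """Sample n boundary points on an elliptical target region with centre
    (cx, cy) and semi-axes a and b (cf. boundary points B1-B8 in Fig. 2c)."""
    t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return np.column_stack([cx + a * np.cos(t), cy + b * np.sin(t)])
```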
At step 506b, the computing system 150 identifies a plurality of inner points on the current cutting path. The inner points correspond to, or are, a subset of the plurality of points of the current cutting path that are located inside the boundary of the target region. The computing system 150 identifies the inner points in a similar manner as explained earlier. Fig. 2c represents exemplary inner points C1-C8.
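Identification of the inner points reduces to a point-in-polygon test of each cutting-path point against the target-region boundary; a minimal sketch follows (the helper name find_inner_points is hypothetical, and a standard containment test is assumed in place of whatever specific technique the system employs).

```python
import numpy as np
from matplotlib.path import Path

def find_inner_points(path_xy, boundary_xy):
    """Return indices of cutting-path points located inside the target-region
    boundary (cf. inner points C1-C8 in Fig. 2c)."""
    region = Path(np.asarray(boundary_xy, dtype=float), closed=True)
    mask = region.contains_points(np.asarray(path_xy, dtype=float))
    return np.nonzero(mask)[0]
```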
At step 506c, the computing system 150 determines a plurality of intersection points. This involves identifying the points where the boundary of the target region intersects with the current cutting path. In other words, the intersection points are a subset of the boundary points intersecting with (i.e., also located on) the current cutting path. The computing system 150 identifies the intersection points in a similar manner as described earlier. Fig. 2d visually depicts exemplary intersection points D1 – D6.
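Since the intersection points are described as boundary points that also lie on the current cutting path, one simple (assumed) realization is a distance test of each sampled boundary point against the path’s segments, with a small tolerance; the helper names below are hypothetical and the path is assumed to be an open polyline.

```python
import numpy as np

def _point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment ab (all 2-vectors)."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / max(np.dot(ab, ab), 1e-12), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def find_intersection_points(boundary_xy, path_xy, tol=0.1):
    """Return boundary points lying on the current cutting path within tol
    (cf. intersection points D1-D6 in Fig. 2d)."""
    hits = []
    for p in np.asarray(boundary_xy, dtype=float):
        for a, b in zip(path_xy[:-1], path_xy[1:]):
            if _point_segment_distance(p, a, b) <= tol:
                hits.append(p)
                break
    return np.array(hits)
```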
At step 506d, the computing system 150 generates the updated cutting path based on the inner points, the boundary points and the intersection points. In an embodiment, the computing system 150 is configured to determine at least one pair of intersection points. Each pair of intersection points includes a first intersection point and a second intersection point adjacently disposed according to the forward direction of the current cutting path. The first intersection point and the second intersection point are defined such that a section of the current cutting path initiates at the first intersection point and terminates at the second intersection point. For example, in the pair of intersection points D1-D2, D1 is the first intersection point and D2 is the second intersection point as shown in Fig. 2d. For each pair of intersection points, the computing system 150 identifies a plurality of first inner points and a plurality of first boundary points in a similar manner as described earlier. Further, the computing system 150 replaces the first inner points of the current cutting path with the first boundary points to generate a section of the updated cutting path. Repeating this process for all pairs of intersection points results in the updated cutting path. Fig. 2e depicts an exemplary updated cutting path 206 generated accordingly.
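The replacement step can be sketched as follows, under simplifying assumptions: each run of consecutive inner points is replaced by a boundary arc joining its entry and exit intersection points (approximated here by the nearest sampled boundary points), and the arc is walked in a fixed direction; a production implementation would select the arc that keeps the path outside the target region. The helper name delete_region is hypothetical.

```python
import numpy as np
from matplotlib.path import Path

def delete_region(path_xy, boundary_xy):
    """Replace each run of inner points with a boundary arc joining the
    entry and exit intersection points (cf. step 506d and Fig. 2e)."""
    path_xy = np.asarray(path_xy, dtype=float)
    boundary_xy = np.asarray(boundary_xy, dtype=float)
    inside = Path(boundary_xy, closed=True).contains_points(path_xy)
    out, i, n = [], 0, len(path_xy)
    while i < n:
        if not inside[i]:
            out.append(path_xy[i])
            i += 1
            continue
        j = i
        while j < n and inside[j]:  # full run of consecutive inner points
            j += 1
        # Nearest sampled boundary points stand in for the exact entry and
        # exit intersection points of this run.
        entry = int(np.argmin(np.linalg.norm(boundary_xy - path_xy[i], axis=1)))
        exit_ = int(np.argmin(np.linalg.norm(boundary_xy - path_xy[j - 1], axis=1)))
        k = entry
        while k != exit_:  # splice the boundary arc into the path
            out.append(boundary_xy[k])
            k = (k + 1) % len(boundary_xy)
        out.append(boundary_xy[exit_])
        i = j
    return np.array(out)
```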
At optional step 508, the computing system 150 verifies the updated cutting path in a similar manner as described earlier. In an embodiment, the computing system 150 verifies the updated cutting path against a bone profile of the patient (either two-dimensional or three-dimensional). In an embodiment, the computing system 150 provides a user interface which enables the user to view a simulation of cutting the target bone according to the updated cutting path. For example, the computing system 150 displays the updated cutting path on a surface of the three-dimensional view of the target bone of the patient. This allows the user to visually verify the updated cutting path designed by the user, thus reducing the chances of incorrect cuts on the bone and/or the surrounding tissue.
At step 510, the computing system 150 sends the updated cutting path to the robot 164. For example, the computing system 150 sends the updated cutting path in a file to the robot 164. The robot 164 controls the cutting tool 164c coupled to the robotic arm 164b based upon the updated cutting path to cut a portion of the target bone.
Fig. 6 illustrates a flowchart of a method 600 for dynamically altering (or adjusting) a cutting path of a bone during a surgical procedure, according to an embodiment. The method 600 is performed intra-operatively, i.e., during a surgical procedure.
At step 602, a current cutting path is retrieved by the computing system 150. In an embodiment, the current cutting path includes the pre-planned cutting path. Various embodiments for retrieving the current cutting path and the pre-planned cutting path have been described earlier. Fig. 3a represents an exemplary current cutting path 312.
At step 604, the computing system 150 receives an update request to update the current cutting path. In an embodiment, the update request includes a scaling factor to scale the current cutting path. The current cutting path is either scaled up or scaled down based on the scaling factor received from the user. Various embodiments related to receiving the update request including the scaling factor are explained earlier. The scaling factor may be greater than one or less than one.
At step 606, the computing system 150 generates an updated cutting path based upon the scaling factor. In an embodiment, the computing system 150 generates the updated cutting path by computing updated values of the coordinates of each of the plurality of points of the current cutting path based upon the scaling factor. For example, the computing system 150 is configured to multiply the coordinate values of each point with the scaling factor to compute the respective updated coordinate values. Further details related to scaling a current cutting path have been explained earlier. Figs. 3a and 3b depict examples for a scaling factor less than and greater than one, respectively.
At optional step 608, the computing system 150 verifies the updated cutting path in a similar manner as described earlier.
At step 610, the computing system 150 sends the updated cutting path to the robot 164, for example, in a file. The robot 164 controls the cutting tool 164c coupled to the robotic arm 164b based upon the updated cutting path to cut a portion of the target bone.
As explained earlier, it is possible that the surgeon may desire to modify the current cutting path by both scaling it and deleting a target region. In this case, the steps 604 and 606 of the method 600 may be performed prior to the step 504 of the method 500. The scaled cutting path generated at the step 606 may then be considered as the current cutting path for the steps 504 onwards.
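Under the same assumptions as the sketches above, combining the two updates amounts to composing the hypothetical helpers: scaling first (steps 604/606), then treating the result as the current path for the deletion steps (504 onwards).

```python
# Scale first, then delete an elliptical target region from the scaled path;
# scale_path, ellipse_boundary_points and delete_region are the hypothetical
# sketches given earlier, and the region parameters here are illustrative.
scaled = scale_path(current, 1.2)
region = ellipse_boundary_points(5.0, 0.0, 3.0, 2.0)
updated = delete_region(scaled, region)
```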
The present disclosure will now be explained with the help of the following examples:
Example 1: A knee arthroplasty was performed using a conventional system.
A conventional robotic surgical system was employed to perform a knee arthroplasty. In such a system, a pre-planned cutting path 702 is uploaded in advance, and the surgical robot executes the surgery based on this cutting path. The pre-planned cutting path 702 was generated by the user prior to surgery according to the patient’s anatomy. As illustrated in Fig. 7a, the pre-planned cutting path 702 was shown overlaid on a target portion of the bone (the femur in this example), the target portion being defined by a boundary 700. However, during the procedure, the surgeon observed that the pre-planned cutting path 702 did not fully cover the target portion and omitted a significant area of the target portion to be resected. The surgeon then manually trimmed the remaining region of the target portion. This manual adjustment resulted in an imprecise bone cut, leading to a poor fit of the implant and subsequent discomfort for the patient. It also significantly increased the surgery time since the surgeon had to trim a large portion of the bone manually.
Example 2: A knee arthroplasty was performed using the system 100 of the present disclosure.
The system 100 was utilized to perform a knee arthroplasty procedure. A pre-planned cutting path 702 was uploaded to the system 100 prior to surgery. During the operation, the surgeon observed that the pre-planned cutting path 702 did not entirely cover the target portion defined by the boundary 700 (similar to the case illustrated in Fig. 7a). In this case, the user intraoperatively updated the pre-planned cutting path 702. As shown in Fig. 7b, the pre-planned cutting path 702 was scaled up by a predefined factor to generate an updated cutting path 704. The updated cutting path 704 fully encompassed the target portion within the boundary 700. The updated cutting path 704 was sent to the surgical robot, which controlled a cutting tool to resect the bone according to the updated cutting path 704. Since the bone was resected using the surgical robot, the cut was precise, smooth and accurate, leading to an optimal fit of the implant. Thus, the system 100 enabled dynamic intraoperative adjustment of the cutting path, thereby eliminating the possibility of human error associated with manual trimming of the bone and ensuring improved accuracy of the cut. It also reduced the surgery time, making the overall process more efficient.
Example 3: A knee arthroplasty was performed using the conventional robotic surgical system.
The knee arthroplasty was carried out using a conventional robotic surgical system. In such a system, a pre-planned cutting path 706, defined by the user, was provided to the robot. However, during the procedure, it was observed that the pre-planned cutting path 706 overlapped with a region 708 that included the posterior cruciate ligament (PCL), as illustrated in Fig. 7c. Accordingly, executing the robotic surgery based on the pre-planned cutting path 706 posed a risk of damaging the PCL. In such circumstances, the surgeon carefully observed the cutting operation of the robot and paused the cutting operation before it entered the region 708. The position of the robot was adjusted such that the end effector was away from the region 708, and the cutting operation was resumed. Since there was a significant overlap between the region 708 and the pre-planned cutting path 706, this pausing and restarting of the cutting operation had to be done frequently and carefully. This process not only prolonged the overall surgery but also introduced additional complexities, thereby causing inconvenience to the patient.
Example 4: A knee arthroplasty was performed using the system 100 of the present disclosure.
The system 100 was employed to perform a knee arthroplasty. The system 100 received a pre-planned cutting path 706 prior to the procedure. During surgery, it was identified that the pre-planned cutting path 706 overlapped with a region 708 that included the posterior cruciate ligament (PCL), similar to the case illustrated in Fig. 7c. Since the region 708 included the PCL, executing the surgery based on the pre-planned cutting path 706 could potentially damage the PCL. To address this, the user intraoperatively updated the pre-planned cutting path 706 by deleting the portion of the pre-planned cutting path 706 overlapping the region 708. The system 100 provided multiple options for selecting the region 708. The surgeon chose the shape 710, as shown in Fig. 7d. Based on this selection, the system 100 generated an updated cutting path 712 by removing the overlapping portion of the pre-planned cutting path 706. The updated cutting path 712 was sent to the surgical robot, and the cutting operation was performed according to the updated cutting path 712. Since the region 708 was avoided by the updated cutting path 712 itself, the cutting operation was not paused. Thus, the system 100 ensured patient safety by preventing damage to the soft tissues, while also reducing overall surgery time and thereby contributing to improved clinical outcomes.
The scope of the invention is only limited by the appended patent claims. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings of the present invention is/are used.
Claims:
WE CLAIM
1. A method for dynamically adjusting a cutting path for a bone during a surgical procedure, the method (500) comprising:
a. retrieving, by a computing system (150), a current cutting path for cutting a bone, the cutting path comprising a plurality of points in a cutting plane, each point associated with coordinates in a two-dimensional space;
b. receiving intra-operatively, by the computing system (150), an update request to update the current cutting path, the update request comprising a target region to be deleted from the current cutting path;
c. generating, by the computing system (150), an updated cutting path based upon the target region, wherein generating the updated cutting path comprises:
i. determining, by the computing system (150), a boundary of the target region comprising a plurality of boundary points, each boundary point associated with coordinates in the two-dimensional space;
ii. identifying, by the computing system (150), a plurality of inner points of the current cutting path based upon the boundary of the target region, the inner points corresponding to a subset of the plurality of points that are located inside the boundary of the target region;
iii. identifying, by the computing system (150), a plurality of intersection points, the plurality of intersection points corresponding to a subset of the boundary points intersecting with the current cutting path; and
iv. generating, by the computing system (150), the updated cutting path based upon the boundary points, the intersection points and the inner points; and
d. sending, by the computing system (150), the updated cutting path to a robot (164) configured to cut the bone according to the updated cutting path.
2. The method as claimed in claim 1, wherein the plurality of intersection points comprises at least one pair of intersection points, each pair of intersection points comprising a first intersection point and a second intersection point adjacently disposed according to a forward direction of the current cutting path, wherein the step of generating the updated cutting path based upon the boundary points, the intersection points and the inner points comprises, for each pair of intersection points:
a. identifying, by the computing system (150), a plurality of first inner points of the plurality of inner points, the plurality of first inner points lying on the current cutting path and extending from the first intersection point to the second intersection point;
b. identifying, by the computing system (150), a plurality of first boundary points of the plurality of boundary points, the plurality of first boundary points lying on the boundary of the target region and extending from the first intersection point to the second intersection point; and
c. replacing, by the computing system (150), the plurality of first inner points with the plurality of first boundary points.
3. The method as claimed in claim 1, wherein the target region has a predefined shape comprising one of: a circle, an ellipse, a polygon, a rectangle, or a square.
4. The method as claimed in claim 1, wherein the target region has a freehand shape.
5. The method as claimed in claim 1, wherein the update request comprises a scaling factor, wherein the step of generating the updated cutting path comprises computing updated values for the coordinates of each of the plurality of points of the current cutting path based upon the scaling factor.
6. The method as claimed in claim 1, wherein the current cutting path comprises a pre-planned cutting path defined in a pre-operative planning stage.
7. The method as claimed in claim 1, wherein the method comprises verifying, by the computing system (150), the updated cutting path against a three-dimensional bone profile of the patient.
8. The method as claimed in claim 1, wherein the method comprises:
a. generating, by the computing system (150), a second image (404) comprising the updated cutting path; and
b. displaying, by a display (152a), a user interface (400) comprising the second image (404).
9. The method as claimed in claim 8, wherein the step of generating the second image (404) comprises generating the second image (404) comprising at least one of: the current cutting path and a bone profile (414a) overlaid on the updated cutting path.
10. A computing system (150) for dynamically adjusting a cutting path for a bone during a surgical procedure, the computing system (150) comprising:
a. a processing unit (160) comprising one or more processors (160a); and
b. a memory (158) coupled to the one or more processors (160a) and storing a set of instructions, that when executed by the one or more processors (160a) causes the processing unit (160) to:
i. retrieve a current cutting path for cutting a bone, the cutting path comprising a plurality of points in a cutting plane, each point associated with coordinates in a two-dimensional space;
ii. receive, intra-operatively, an update request to update the current cutting path, the update request comprising a target region to be deleted from the current cutting path;
iii. determine a boundary of the target region comprising a plurality of boundary points, each boundary point associated with coordinates in the two-dimensional space;
iv. identify a plurality of inner points of the current cutting path based upon the boundary of the target region, the inner points corresponding to a subset of the plurality of points that are located inside the boundary of the target region;
v. identify a plurality of intersection points, the plurality of intersection points corresponding to a subset of the boundary points intersecting with the current cutting path; and
vi. generate an updated cutting path based upon the boundary points, the intersection points and the inner points; and
vii. send the updated cutting path to a robot (164) configured to cut the bone according to the updated cutting path.
11. The computing system (150) as claimed in claim 10, wherein the plurality of intersection points comprises at least one pair of intersection points, each pair of intersection points comprising a first intersection point and a second intersection point adjacently disposed according to a forward direction of the current cutting path, wherein the set of instructions, when executed by the one or more processors (160a), cause the processing unit (160) to, for each pair of intersection points:
a. identify a plurality of first inner points of the plurality of inner points, the plurality of first inner points lying on the current cutting path and extending from the first intersection point to the second intersection point;
b. identify a plurality of first boundary points of the plurality of boundary points, the plurality of first boundary points lying on the boundary of the target region and extending from the first intersection point to the second intersection point; and
c. replace the plurality of first inner points with the plurality of first boundary points.
12. The computing system (150) as claimed in claim 10, wherein the target region has a predefined shape comprising one of: a circle, an ellipse, a polygon, a rectangle, or a square.
13. The computing system (150) as claimed in claim 10, wherein the target region has a freehand shape.
14. The computing system (150) as claimed in claim 10, wherein the update request comprises a scaling factor, wherein the set of instructions, when executed by the one or more processors (160a), cause the processing unit (160) to compute updated values of the coordinates of each of the plurality of points of the current cutting path based upon the scaling factor to generate the updated cutting path.
15. The computing system (150) as claimed in claim 10, wherein the current cutting path comprises a pre-planned cutting path defined in a pre-operative planning stage.
16. The computing system (150) as claimed in claim 10, wherein the set of instructions, when executed by the one or more processors (160a), cause the processing unit (160) to: generate a second image (404) comprising the updated cutting path, wherein the computing system (150) comprises a display (152a), coupled to the processing unit (160) and configured to display a user interface (400) comprising the second image (404).
17. The computing system (150) as claimed in claim 16, wherein the set of instructions, when executed by the one or more processors (160a), causes the processing unit (160) to: generate the second image (404) comprising at least one of: the current cutting path and a bone profile (414a) overlaid on the updated cutting path.
18. The computing system (150) as claimed in claim 10, wherein the set of instructions, when executed by the one or more processors (160a), causes the processing unit (160) to verify the updated cutting path against a three-dimensional bone profile of the patient.