
Method And System For Mapping Master Controller Movements To Surgical Instrument Tool Tip In Surgical Robots

Abstract: A robotic surgical system comprises a surgeon master console, a master controller, a surgical unit, one or more surgical arms, a surgical instrument tool and a camera. The master controller is configured to receive movements from a surgeon, via an input controller, in a master controller frame. The movements are translated from the master controller frame to movements in a display unit frame. Further, the movements in the display unit frame are translated to movements in a camera frame and subsequently to a world frame. Next, the world frame movements are translated to an instrument arm frame, and the incremental positional shift of the instrument tool tip is calculated to determine a new position. The new position is then transmitted as control commands to one or more motors controlling the surgical instrument tool to move the tool tip to the new position. Rotation matrices are defined and utilized for translation of movements between frames. FIG. 4


Patent Information

Filing Date
31 March 2025
Publication Number
15/2025
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING

Applicants

Merai Newage Private Limited
Survey No. 1574, Bilakhia House, Chala, Muktanand Marg, Vapi, Valsad 396191, Gujarat, India

Inventors

1. VATSA, Abhishek
Near Kamta Sakhi Math, Prabhunath Nagar, Chhapra - 841301, Bihar, India
2. MARELLA, Rajeev Reddy
3-296/B/1, Srinagar Colony, Kodada - 508206, Telangana, India
3. TAKKALLA, Bhanu Prakash Reddy
H: No. 1-71, Rameshwarpally, Bhiknoor, Kamreddy - 503101, Telangana, India

Specification

Description:
TECHNICAL FIELD
The present disclosure relates generally to the field of robotic surgical systems and, more particularly, to a method and system for mapping master controller movements to surgical instrument tool tip in surgical robots.
BACKGROUND
Robotic surgical systems have revolutionized modern surgical procedures by enabling minimally invasive approaches with enhanced precision and control. The robotic surgical systems typically employ sophisticated robotic arms that manipulate specialized surgical instruments through small incisions in the patient's body. The success of such procedures heavily relies on the precise control and positioning of these surgical instruments, which must operate with exceptional accuracy within confined anatomical spaces.
In the realm of surgical robotics, a significant challenge arises from the necessity to accurately translate a surgeon's movements on a master controller to the corresponding actions of a robotic instrument tip. In current robotic surgical systems, the system is controlled during a medical procedure by a surgeon using a console comprising a controller. The console is interfaced with a user interface, typically a display unit. The display unit allows the surgeon to view and manoeuvre a robotic surgical instrument, using the controller, to act on a patient at a surgical site. Visualization of the surgical site is provided by one or more cameras that deliver real-time images. Proper manoeuvring of the surgical instrument with the controller, based on the visualization provided by the cameras, is crucial for the successful completion of the medical procedure.
The core challenge, therefore, lies in mapping a surgeon's movements on the controller to the movement of the robotic surgical instrument. Existing approaches frequently suffer from limitations related to the complexity of integrating multiple coordinate frames. For example, the image viewed by the surgeon on the display unit of the surgeon console might be magnified, and thus operates in a different coordinate system with respect to the coordinate frame of the hand controller used by the surgeon to perform the surgery. Further, seamless visual feedback is critical for ensuring that the surgeon can maintain situational awareness and control throughout the surgical procedure. Dynamic changes to the position of the camera, made to obtain a better view of the surgical site, add to the complexity and hinder smooth operation of the surgical procedure. There is therefore a need for a system and method to establish a relationship between movements of a controller of a robotic surgical system and the associated surgical instrument as viewed in the image. It would be advantageous to map the orientation and position of the instrument as displayed to the surgeon with the orientation and position of the surgeon's hand when manoeuvring the controller. Ensuring accurate mapping between these inputs is crucial for maintaining operational fidelity during surgery.
Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks of existing surgical systems for mapping master controller movements to surgical instrument tool tip in surgical robots.
SUMMARY
The aim of the present disclosure is to provide a robotic surgical system and method for controlling a surgical instrument tool to enhance precision and control in robotic surgery by translating movements through multiple reference frames. Embodiments of the present disclosure substantially eliminate or at least partially address the aforementioned problems in the prior art and improve the accuracy of tool tip positioning in robotic surgical procedures through a systematic translation of movements across various frames of reference.
One or more objectives of the present disclosure are achieved by the solutions provided in the enclosed independent claims. Advantageous implementations of the present disclosure are further defined in the dependent claims.
In one aspect, the present disclosure provides a robotic surgical system comprising:
a surgeon master console comprising an input control and a display unit;
a surgical unit comprising one or more surgical instrument arms mounted at a first end to the surgical unit, wherein the one or more surgical arms comprise an endoscope or a surgical instrument tool, wherein the surgical instrument tool comprises a tool tip and the endoscope comprises a camera; and
a master controller, wherein the master controller is configured to:
receive movements of the input control in a master controller frame established with reference to the input control;
translate the movements of the input control in the master controller frame to movements of the surgical instrument tool in a display unit frame established with reference to the display unit;
translate the movements captured in the display unit frame to movements in a camera frame established with reference to the camera of the endoscope;
translate the movements in the camera frame to movements in a world frame established with reference to a surgical workspace;
translate the movements in the world frame to movements in a surgical instrument arm frame established at the first end of the one or more surgical instrument arms;
calculate a new position of the tool tip in the instrument arm frame; and
transmit control commands to one or more motors controlling the surgical instrument tool to move the tool tip to the new position.
In another aspect, the present disclosure provides a method for controlling a surgical instrument tool of a surgical robot with a master controller comprising:
receiving, by the master controller, movements of an input controller associated with a surgeon master console;
establishing, by the master controller, the movements of the input control in a master controller frame with reference to the position of the input control;
translating, by the master controller, the movements of the input control in the master controller frame to movements of the surgical instrument tool in a display unit frame established with reference to a display unit associated with the surgeon master console;
translating, by the master controller, the movements captured in the display unit frame to movements in a camera frame established with reference to the one or more cameras associated with one or more surgical arms;
translating, by the master controller, the movements in the camera frame to movements in a world frame established with reference to a surgical workspace;
translating, by the master controller, the movements in the world frame to movements in a surgical instrument arm frame established at the first end of the one or more surgical instrument arms; wherein the surgical instrument arm is coupled to a surgical unit at the first end;
calculating, by the master controller, a new position of a tool tip in the surgical instrument arm frame; and
transmitting, by the master controller, control commands to one or more motors controlling a surgical instrument tool to move the tool tip to the new position.
The present disclosure provides a comprehensive robotic surgical system and method that significantly enhances the precision and control of surgical instruments through a sophisticated mapping of movements from a surgeon's master console to the robotic tool tip. By establishing multiple coordinate frames—namely the master controller frame, display unit frame, camera frame, world frame, surgical instrument arm frame and instrument tip frame—the system ensures that every movement made by the surgeon is accurately translated into corresponding movements of the surgical tool, thereby maintaining operational fidelity during procedures. The integration of rotation matrices facilitates seamless transformations between these frames, ensuring numerical stability and computational efficiency, which are critical in real-time surgical environments. The value of the respective rotation matrix will be different based on an initial frame assignment and references assumed in calculations. This mapping process not only allows for intuitive control of the surgical instruments but also provides enhanced visual feedback through the display unit and camera, allowing surgeons to operate with greater confidence and precision. The ability to calculate the new position of the tool tip in the instrument arm frame and transmit control commands to the motors controlling the surgical tool ensures that the robotic system responds promptly and accurately to the surgeon's inputs, thereby reducing the risk of errors and improving patient outcomes. Each feature of the system, from the establishment of coordinate frames to the use of rotation matrices and real-time command transmission, synergistically contributes to a robust framework that elevates the capabilities of robotic surgery, making it a transformative approach in minimally invasive procedures.
The framework incorporates six essential coordinate frames, including Master Controller Frame (FM) - the reference frame for the surgeon’s input, Display Unit Frame (FD) - the frame of reference for the 3D display, Camera Frame (FC) - the frame representing the surgical camera’s perspective, World Frame (FW) - a global reference frame for positioning, Instrument Arm Frame (FI) - the frame representing the base of the Instrument Arm and Instrument Tip Frame (FT) - the frame for the instrument’s tip. The framework operates by translating the surgeon's inputs from the Master Controller Frame (FM) to the Instrument Tip Frame (FT) through a series of coordinate transformations. It utilizes rotation matrices and their transposes to maintain numerical stability and computational efficiency during these transformations. The system updates the position of the instrument's tool tip in the Instrument Arm Frame (FI) by incorporating incremental movements since the last update. This process ensures that the movements of the instrument tool tip are precise and intuitive, aligning with the perspective of the surgical camera frame (FC). By employing six essential coordinate frames, the framework facilitates real-time feedback and accurate control of the robotic surgical system. This approach is employed to enhance surgical precision and reduce errors during robotic-assisted procedures, ensuring that the surgeon's intentions are accurately reflected in the instrument's movements. The precise transformation between the various coordinate frames results in improved real-time instrument control, which enhances the overall efficiency and effectiveness of the robotic surgical system.
The term "master controller frame" refers to the coordinate system associated with the input control of the master console, which governs the overall operation and movement of the surgical instruments. The term "display unit frame" refers to the coordinate system associated with the display unit, which presents visual information to the user regarding the surgical environment and instrument positioning. The term "camera frame" refers to the coordinate system established by the camera used to capture visual data within the surgical environment. The term "world frame" refers to a fixed reference coordinate system that serves as a baseline for spatial orientation and positioning of all elements within the surgical workspace. The term "surgical instrument arm frame" refers to the coordinate system associated with the robotic arm that manipulates the surgical instrument tool, allowing for precise control and positioning. The term “instrument tip frame” refers to the coordinate system associated with the tip of the surgical instrument and associated with the incremental positional changes to the instrument tip.
Initially, the master controller captures the surgeon's input in the Master controller Frame (FM), which is then referenced against the Display Unit Frame (FD) to ensure accurate visualization. The Camera Frame (FC) provides a perspective that aligns the surgeon's input with the surgical environment. The World Frame (FW) serves as a global reference, allowing for consistent positioning across different frames. The Instrument Arm Frame (FI) and Instrument Tip Frame (FT) are utilized to ensure that the movements of the surgical instrument tool are accurately reflected at the tool tip.
In accordance with an embodiment, a surgeon master console comprises an input control/handle and a display unit. Throughout the present disclosure, the term "surgeon master console" refers to a centralized control interface utilized by a surgeon to manipulate robotic instruments and visualize surgical procedures. The term "input control" refers to a device or mechanism that allows the surgeon to provide commands and control the movement of robotic instruments during surgery. Examples of input control devices include trackballs, joysticks, and handles. The term "display unit" refers to a visual interface/screen that presents real-time images, data, and feedback to the surgeon, facilitating informed decision-making during the surgical process.
Throughout the present disclosure, the term "master control" refers to a centralized processing unit that orchestrates the operation of various components within the robotic surgical system. The master control is generally an industrial PC or single-board PC in accordance with an embodiment of the invention. Throughout the present disclosure, "master controller" and "hand controller/main controller/master manipulator/input controller" are used interchangeably, and the master controller is configured to perform the function of receiving the surgeon's movements of the input control while performing a surgery. The master controller receives input control movements within a defined frame, allowing for precise tracking of user commands. It translates these movements into a display unit frame, ensuring that the surgical instrument tool's actions correspond accurately to the user's intentions. Subsequently, the system converts the display unit movements into a camera frame, aligning the visual feedback with the actual tool position. Each camera frame is then transformed into a world frame, which represents the surgical workspace, facilitating a comprehensive understanding of the spatial relationships involved. Finally, the movements are translated into the surgical instrument arm frame, where the tool tip's new position is calculated and control commands are sent to the motors, enabling real-time adjustments to the tool's location. This process is essential for ensuring that the surgical instrument operates in harmony with the surgeon's movements, enhancing precision during procedures. The coordinated translation of movements across multiple frames results in improved accuracy and responsiveness of the surgical instrument, leading to enhanced operational efficiency.
In accordance with an embodiment, a surgical unit comprises one or more surgical instrument arms mounted at a first end to the surgical unit, wherein each one of the one or more surgical arms comprises an endoscope comprising a camera, or a surgical instrument tool comprising a tool tip. Throughout the present disclosure, the term "surgical unit" refers to a comprehensive assembly of components designed to facilitate the execution of surgical procedures, encompassing both robotic and manual functionalities. The surgical unit, also referred to as an arm-cart or patient-side cart, is a mobile unit having a base mounted on wheels. The base of the surgical unit includes locking mechanisms for securing the surgical unit. The term "surgical instrument arms" refers to articulated appendages of the robotic system that are equipped to manipulate various surgical instruments with precision and dexterity during operations. The term "camera" refers to an imaging device integrated with the surgical unit via the endoscope, which provides real-time visual feedback to the surgical team, enhancing the ability to monitor and assess the surgical field. The surgical unit includes multiple surgical instrument arms or robotic arms that extend from a first end of the surgical unit. In some implementations, the surgical robotic instrument arms comprise four arms, in which three arms are configured for surgical instrument manipulation and one arm is configured for camera or endoscopic imaging.
In accordance with an embodiment, a surgical instrument tool comprises a tool tip coupled with the one or more surgical instrument arms. Throughout the present disclosure, the term "surgical instrument tool" refers to any device or implement utilized in the performance of surgical procedures, including but not limited to cutting, grasping, or suturing tissues or manipulating tissue. The term "tool tip" refers to the distal end of a surgical instrument tool that interacts directly with the tissue or anatomical structure during a surgical operation, often designed to facilitate specific functions such as cutting, cauterizing, or manipulating tissue. The term "control commands" refers to the instructions generated by the master controller to direct the operation of the robotic system and its components. The term "motors" refers to electromechanical devices that convert electrical energy into mechanical motion, facilitating the movement of the surgical instrument and robotic arms.
In accordance with an embodiment, a master controller is configured to: receive movements of the input controller in a master controller frame established with reference to the input controller; translate the movements of the input controller in the master controller frame to movements of the surgical instrument tool in a display unit frame established with reference to the display unit; translate the movements captured in the display unit frame to movements in a camera frame established with reference to each of the one or more camera; translate the movements in each of the camera frame to movements in a world frame established with reference to a surgical workspace; translate the movements in the world frame to movements in a surgical instrument arm frame established at the first end of the one or more surgical instrument arms; calculate a new position of the tool tip in the instrument arm frame; and transmit control commands to one or more motors controlling the surgical instrument tool to move the tool tip to the new position.
Throughout the present disclosure, the term "display rotation matrix" refers to a mathematical representation that describes the orientation and position of a display unit in relation to the master controller frame. The value of the rotation matrix will be different based on an initial frame assignment and references assumed in calculations. The term "display unit frame" refers to the coordinate system associated with the display unit, which presents visual information to the user regarding the surgical environment and instrument positioning. The display rotation matrix is utilized to convert the spatial orientation of the input control into corresponding movements of the instrument tool. This process involves mathematical transformations that account for the relative positions and angles between the master controller and the display unit. By applying the rotation matrix, the system ensures that the movements of the input control are accurately reflected in the display unit frame, maintaining the intended trajectory of the surgical instrument. The matrix takes into consideration the three-dimensional coordinates of both the input control and the instrument tool, allowing for precise alignment. This transformation is essential for real-time feedback, enabling the surgeon to visualize the tool's position accurately during the procedure. This approach is employed to enhance the surgeon's control and precision during robotic-assisted surgeries, particularly in complex procedures where spatial awareness is crucial. The application of the display rotation matrix results in improved accuracy in the representation of the instrument tool's movements, leading to more effective surgical interventions.
In accordance with an embodiment, a display rotation matrix (RMD) is applied to translate the movements of the input control in the master controller frame to movements of instrument tool in a display unit frame. This step is necessary because the input from the surgeon is initially captured in the Master Controller Frame, while the system’s visual output is presented in the Display Unit Frame. The display rotation matrix is applied to align these two frames. The below equation is used for the alignment.
∆PD = (RMD)^T · ∆PM
where: ΔPD is the movement in the Display Unit frame,
ΔPM is the movement in the Master Controller frame,
(RMD) is the display rotation matrix aligning the Master frame (FM) to the Display frame (FD).
The display rotation matrix (RMD) is given by:
(RMD) = (RMD)st · Ry(θD)
Where:
(RMD)st =
[ 0   0   1 ]
[ 0  -1   0 ]
[ 1   0   0 ]

Ry(θD) =
[ cos(θD)   0   -sin(θD) ]
[    0      1       0    ]
[ sin(θD)   0    cos(θD) ]
(RMD)st is the default alignment matrix from the Master frame FM to the Display frame FD.
The rotation matrix Ry(θD) represents the tilt of the Display Unit frame about the y-axis by an angle θD. The display rotation matrix is constructed to align the display unit frame with the master controller frame by applying a tilt about the y-axis by an angle θD. By incorporating this rotation, the system aligns the display to the optimal viewing angle, tailored to the surgeon’s perspective. This ensures ergonomic comfort, reduces strain during prolonged use, and improves the overall usability of the system, thereby enhancing the clinician’s interaction with the robotic console.
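As a minimal numerical sketch of the master-to-display translation described above (using NumPy; the function name and test vectors are illustrative assumptions, and the sign convention of Ry follows the matrix given in the equations):

```python
import numpy as np

def Ry(theta):
    # Rotation about the y-axis, using the sign convention of Ry(θD) above
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

# Default alignment matrix (RMD)st from the Master frame FM to the Display frame FD
RMD_ST = np.array([[0.0, 0.0, 1.0],
                   [0.0, -1.0, 0.0],
                   [1.0, 0.0, 0.0]])

def master_to_display(delta_pm, theta_d):
    # ΔPD = (RMD)^T · ΔPM, with RMD = (RMD)st · Ry(θD)
    rmd = RMD_ST @ Ry(theta_d)
    return rmd.T @ delta_pm
```

Because RMD is orthogonal, its transpose equals its inverse, which is why the transpose suffices for the frame translation and keeps the computation numerically stable.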
Once the movements are captured in the Display Unit Frame (FD), the next step is to map them to the Camera Frame (FC), which represents the perspective of the surgical camera. In accordance with an embodiment, a camera rotation matrix is applied to the movements captured in the display unit frame to movements in a camera frame. Throughout the present disclosure, the term "camera rotation matrix" refers to a mathematical representation that describes the orientation of a camera in three-dimensional space, facilitating the transformation of coordinates from the camera's local frame to a reference frame. The value of the rotation matrix will be different based on an initial frame assignment and references assumed in calculations. The camera rotation matrix is utilized to ensure that the movements captured in the display unit frame (FD) are accurately represented in the camera frame (FC). This is necessary because the visual feedback is displayed through the camera, and we need to ensure that the surgeon’s movements are correctly reflected in this frame. The transformation is achieved by applying the below equation:
∆PC = (RDC)^T · ∆PD
Where:
(RDC) is the camera rotation matrix aligning the Display Frame (FD) to the Camera Frame (FC).
The orientation of the camera frame with respect to the display frame is of critical importance when applying the transformation using the camera rotation matrix (RDC). If both frames are aligned in the same orientation, then the camera rotation matrix (RDC) is a 3×3 identity matrix [I]. Since both frames are aligned in the same orientation, there is no need for flipping or additional transformations. The direct mapping of movements allows for a seamless transition from the display to the camera perspective. This alignment ensures that the surgeon's actions are reflected precisely as intended, enhancing the intuitive operation of the robotic system.
(RDC) = I {Identity Matrix}
In the case when the display and camera are oriented in opposite directions, such as when the camera’s view is mirrored or inverted relative to the display, an adjustment needs to be applied to the camera rotation matrix to ensure proper alignment between the frames. This adjustment is made by flipping the display frame by 180° about the z-axis; the camera rotation matrix then has the value below. By compensating for this orientation mismatch, the system ensures that the surgeon’s movements on the display seamlessly correspond to the camera’s perspective, providing intuitive and accurate real-time visual feedback.
(RDC) =
[ -1   0   0 ]
[  0  -1   0 ]
[  0   0   1 ]
In accordance with an embodiment, the camera rotation matrix is flipped 180° about the z-axis based on an input from a surgeon to the master controller using the surgeon master console. This functionality is activated via a designated digital input signal, triggered based on the specific requirements of the surgical procedure as determined by the surgeon. By dynamically adapting to the operational demands of each procedure, the system maintains precise coordination between the surgeon’s inputs, the robotic system, and the visual feedback, enhancing both efficiency and safety during surgery. This feature is utilized when the surgeon requires a different viewpoint to enhance visibility or to correct the camera's orientation during surgery. The precise flipping of the camera rotation matrix improves the surgeon's situational awareness, leading to more accurate and effective surgical interventions.
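A minimal sketch of this display-to-camera step (NumPy; the `mirrored` flag is an illustrative stand-in for the surgeon's digital input signal, not part of the disclosure):

```python
import numpy as np

RDC_ALIGNED = np.eye(3)                    # display and camera share one orientation
RDC_FLIPPED = np.array([[-1.0, 0.0, 0.0],  # display frame flipped 180° about the z-axis
                        [0.0, -1.0, 0.0],
                        [0.0, 0.0, 1.0]])

def display_to_camera(delta_pd, mirrored=False):
    # ΔPC = (RDC)^T · ΔPD; choose the flipped matrix when the view is mirrored
    rdc = RDC_FLIPPED if mirrored else RDC_ALIGNED
    return rdc.T @ delta_pd
```

With the mirrored matrix, the x and y components of a movement change sign while the z component is unchanged, matching the 180° flip about the z-axis described above.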
The next critical transformation is to translate the movements from the camera frame FC to the movements in the world frame FW, which serves as a global reference system for positioning the robotic tool tip in the surgical workspace. In accordance with an embodiment, a world frame rotation matrix is applied to the movements in the camera frame to the movements in a world frame. This transformation is represented by the below equation.
∆PW = (RWC) · ∆PC
Where:
RWC = (RWC)st,endoscope · Ry(θC), and
(RWC)st,endoscope is the world frame rotation matrix aligning the camera frame to the world frame for a 0° endoscope.
Ry(θC) is the endoscope rotation matrix for an angular endoscope, where θC is the angle of the endoscope.
The world frame rotation matrix is adjusted using an endoscope rotation matrix to account for angular offset between the camera frame and the world frame. Throughout the present disclosure, the term "world frame rotation matrix" refers to a mathematical representation that describes the orientation of a coordinate system in relation to a fixed global reference frame. The term "endoscope rotation matrix" refers to a mathematical construct that characterizes the rotational position of an endoscope relative to its own local coordinate system. The value of the rotation matrices will be different based on an initial frame assignment and references assumed in calculations. The term "angular offset" refers to the angular difference between two rotational positions, typically expressed in degrees or radians, which may be utilized to align or adjust the orientation of robotic components during surgical procedures. In an embodiment, the angular offset refers to the angular difference between the camera frame orientation and world frame orientation. The adjustment of the world frame rotation matrix is achieved by incorporating the endoscope rotation matrix, which accounts for the angular offset between the camera frame and the world frame. This process involves calculating the rotation matrix Ry(θC) based on the angle θC of the endoscope, which is essential for aligning the visual perspectives. For instance, when the endoscope is positioned at an angle of 30 degrees, then θC = π/6.
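A hedged numerical sketch of the camera-to-world step for a 30° endoscope (NumPy; (RWC)st,endoscope is taken as the identity purely for illustration — its actual value depends on the system's initial frame assignment):

```python
import numpy as np

# Assumed 0°-endoscope alignment, for illustration only; system-specific in practice
RWC_ST = np.eye(3)

def Ry(theta):
    # Endoscope rotation about the y-axis, same sign convention as Ry(θD) above
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

def camera_to_world(delta_pc, theta_c):
    # ΔPW = RWC · ΔPC, with RWC = (RWC)st,endoscope · Ry(θC)
    rwc = RWC_ST @ Ry(theta_c)
    return rwc @ delta_pc
```

Under this assumed alignment, for θC = π/6 (a 30° endoscope) a unit movement along the camera x-axis maps to (√3/2, 0, 1/2) in the world frame.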
In accordance with an embodiment, an instrument arm rotation matrix is applied to the movements in the world frame to translate them to movements in a surgical instrument arm frame. The value of the rotation matrices will be different based on an initial frame assignment and references assumed in calculations. By establishing a surgical instrument arm frame at the first end of the surgical instrument arm, the method maintains a consistent reference for movement. This transformation is essential for accurately reflecting the surgeon's input from the Master Controller Frame (FM) in a global context. This transformation is represented by the below equation. This step ensures that the movements in the World Frame are properly aligned in the Instrument Arm Frame, enabling the subsequent transformation of movements to joint commands.
∆PI = (RWI)^T · ∆PW
Where:
RWI is the rotation matrix aligning the world frame to the instrument arm frame.
The final step is calculating the new position of the instrument’s tool tip in the Instrument Arm Frame. In accordance with an embodiment, the new position is calculated based on the current position of the instrument. By utilizing the current position of the instrument, the new position is computed through matrix multiplication, which incorporates both the rotational and translational components of the instrument's movement. The calculation ensures that the orientation and position are accurately represented in the local frame, allowing for precise control and navigation. This method relies on real-time data to adjust the instrument's position dynamically, ensuring that the calculations reflect any changes in the instrument's current state. This approach is employed to ensure that the instrument can adapt to varying operational conditions and maintain accuracy in its positioning. The new position is calculated by the equation
PINew = PICurrent + ∆PI
where
PINew is the new position of the instrument in the Instrument Frame after the movement.
PICurrent is the current position of the instrument in the Instrument Arm Frame before the movement.
∆PI is the incremental movement vector, representing the change in position from the previous state.
An incremental movement vector (∆PI) is computed by determining the current position of the instrument in the Instrument Arm Frame prior to movement. This vector represents the change in position from the previous state, allowing for precise updates to the instrument's tool tip location. The new position is calculated by adding the incremental movement vector to the current position, ensuring that the movements are accurately captured. Once the updated position is established in the Instrument Arm Frame, it is transformed into the Display Unit Frame (FD) for visual feedback. Subsequently, the movements are mapped to the Camera Frame (FC) using the camera rotation matrix (RDC), aligning the surgeon's actions with the camera's perspective. This process is essential to ensure that the movements of the instrument are accurately represented in the visual feedback provided to the surgeon, facilitating effective surgical procedures. The precise calculation and transformation of the incremental movement vector enhance the accuracy of the instrument's positioning, resulting in improved alignment between the surgeon's actions and the visual feedback displayed.
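The full chain — master frame through display, camera and world frames to the incremental update PINew = PICurrent + ∆PI — can be sketched as below (NumPy; all rotation matrices are passed in as parameters because, per the disclosure, their values depend on the initial frame assignment, and the function name is an illustrative assumption):

```python
import numpy as np

def Ry(theta):
    # Rotation about the y-axis, same sign convention as in the equations above
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, -s],
                     [0.0, 1.0, 0.0],
                     [s, 0.0, c]])

def update_tool_tip(p_current, delta_pm, theta_d, theta_c,
                    rmd_st, rdc, rwc_st, rwi):
    # Chain the four frame translations, then apply PINew = PICurrent + ΔPI
    delta_pd = (rmd_st @ Ry(theta_d)).T @ delta_pm   # master  -> display
    delta_pc = rdc.T @ delta_pd                      # display -> camera
    delta_pw = (rwc_st @ Ry(theta_c)) @ delta_pc     # camera  -> world
    delta_pi = rwi.T @ delta_pw                      # world   -> instrument arm
    return p_current + delta_pi
```

With all matrices set to the identity and both angles zero, the chain reduces to a direct pass-through of the master-frame movement, which is a useful sanity check when configuring the real frame assignments.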
The proposed framework translates the surgeon’s inputs from the Master Controller to precise, intuitive movements of the instrument tool tip, aligned with the camera’s perspective. By using respective rotation matrices and their transposes, the system ensures numerical stability and computational efficiency during coordinate transformations. The World Frame serves as a global reference, providing consistent transformations across local frames (Master, Display, Camera and Instrument arm). This approach enables accurate tool positioning and facilitates integration across system components. In summary, the framework offers an efficient solution for real-time instrument control, improving surgical precision and reducing errors.
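The chain of frame-to-frame rotations summarized above can be sketched as follows. All matrix names here are illustrative placeholders (the actual matrices depend on the console, camera, and arm geometry); the sketch only shows how orthonormal rotation matrices and their transposes chain a master-frame movement through to the instrument arm frame:

```python
import numpy as np

def map_master_to_instrument(delta_master: np.ndarray,
                             R_display: np.ndarray,
                             R_camera: np.ndarray,
                             R_world: np.ndarray,
                             R_instrument: np.ndarray) -> np.ndarray:
    """Chain the frame-to-frame rotations. For an orthonormal rotation
    matrix the transpose equals the inverse, which is what keeps the
    transformations numerically stable and cheap to compute."""
    delta_display = R_display @ delta_master    # Master -> Display
    delta_camera = R_camera @ delta_display     # Display -> Camera
    delta_world = R_world @ delta_camera        # Camera -> World
    return R_instrument.T @ delta_world         # World -> Instrument Arm

# With identity rotations, the movement passes through unchanged
I = np.eye(3)
delta = map_master_to_instrument(np.array([1.0, 0.0, 0.0]), I, I, I, I)
```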
It has to be noted that all devices, elements, circuitry, units and means described in the present application could be implemented in the software or hardware elements or any kind of combination thereof. All steps which are performed by the various entities described in the present application as well as the functionalities described to be performed by the various entities are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
Additional aspects, advantages, features, and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative implementations construed in conjunction with the appended claims that follow.
BRIEF DESCRIPTION OF THE DRAWINGS
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
FIG. 1 is a diagram illustrating a robotic surgical system, in accordance with an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a surgeon master console, in accordance with an embodiment of the present disclosure.
FIG. 3A is a diagram illustrating a surgical unit, in accordance with an embodiment of the present disclosure.
FIG. 3B is a diagram illustrating an exploded view of a surgical unit comprising a surgical instrument tool tip, in accordance with an embodiment of the present disclosure.
FIG. 3C is a diagram illustrating an exploded view of a surgical unit comprising an endoscope comprising a camera, in accordance with an embodiment of the present disclosure.
FIG. 4 is a diagram illustrating the master controller frame and the display frame, in accordance with an embodiment of the present disclosure.
FIG. 5 is a diagram illustrating the camera frame and the world frame, in accordance with an embodiment of the present disclosure.
FIG. 6 is a diagram illustrating the world frame and the instrument arm frame, in accordance with an embodiment of the present disclosure.
FIG. 7 is a diagram illustrating the instrument tip frame, in accordance with an embodiment of the present disclosure.
In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF EMBODIMENTS
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
FIG. 1 is a diagram illustrating a robotic surgical system, in accordance with an embodiment of the present disclosure. With reference to FIG. 1, there is shown a robotic surgical system 100, including a patient-side cart 110, a vision cart 120, and a surgeon console 130.
The surgical unit or arm cart 110 is a mobile unit having a base mounted on wheels. The base includes locking mechanisms for securing surgical unit 110 in position. The surgical unit 110 includes a vertical column extending upward from the base. The vertical column comprises a linear actuator enabling height adjustment. The surgical unit 110 includes multiple robotic arms that extend from the vertical column. In some implementations, the multiple robotic arms include four robotic arms in which three robotic arms 112 are configured for surgical instrument manipulation and one robotic arm 113 is configured for endoscopic imaging. The robotic arms 112 include primary segments, secondary segments, and tertiary segments connected by rotational joints. The rotational joints contain servo motors enabling precise angular positioning. The robotic arms 112 include surgical instrument tool/holders 114 at distal ends. The surgical instrument tool/holders 114 comprise mechanical interfaces and electrical connectors. The mechanical interfaces include spring-loaded clamps for instrument attachment. The electrical connectors transmit power and signals to mounted instruments. The surgical unit 110 further includes at least one surgical instrument tool tip 140 mounted to the surgical instrument tool/holders 114 at one of the robotic arms 112. The surgical instrument tool tip includes elongated shafts with end effectors at distal tips. The end effectors include articulation mechanisms enabling pitch and yaw movements. The surgical robotic instrument tool tip 140 includes internal drive cables connecting to motor units in the instrument holders. The drive cables actuate the end effector movements. The robotic arm 113 supports an endoscopic imaging system. Each of the robotic arms 112 includes additional degrees of freedom for camera positioning. The endoscopic imaging system includes dual high-definition camera sensors mounted at a distal end of the robotic arm 113. 
The dual camera sensors enable stereoscopic image capture. The endoscopic imaging system includes fibre optic light transmission bundles surrounding the camera sensors for illuminating the surgical field. The endoscopic imaging system enables both white light imaging and near-infrared fluorescence visualization. The endoscopic imaging system comprises glass rod lenses for controlling chromatic aberration and enhancing image quality.
The vision cart 120 is a mobile unit comprising a base with wheels and a vertical housing. The base contains power supply units and cooling systems. The vertical housing contains processing units and displays. The vertical housing includes ventilation channels for thermal management. The vision cart 120 includes a primary display 122 mounted at an upper portion of the vertical housing, wherein the primary display 122 comprises a high-definition LCD monitor with anti-glare coating. The vision cart 120 includes an electrosurgical unit (ESU) 124 mounted within the vertical housing. The vision cart 120 further includes endoscope light sources. The endoscope light sources comprise one or two light source units mounted within the vertical housing. The vision cart 120 includes an insufflator unit mounted within the vertical housing for creating and maintaining pneumoperitoneum. The vision cart 120 includes an uninterruptible power supply (UPS) system mounted within the base for providing backup power. The vision cart 120 further includes a video processing unit and a central processing unit within the vertical housing. The video processing unit includes dedicated graphics processors. The central processing unit comprises multiple processing cores. The vision cart 120 further includes data storage devices mounted within the vertical housing. In some implementations, the vision cart 120 comprises image enhancement processors for contrast adjustment and noise reduction. In some implementations, the vision cart 120 includes fluorescence imaging processors for tissue identification. In some implementations, the vision cart 120 includes augmented reality processors for data overlay generation.
The master surgeon console 130 includes a base structure supporting an operator seat and control interfaces. The base structure includes levelling mechanisms for stable positioning. The operator seat comprises height adjustment mechanisms and lumbar support systems. A display unit extends upward and forward from the base structure. The display unit contains a stereoscopic display system 134. The stereoscopic display system 134 includes dual display panels and optical elements. The optical elements include focusing mechanisms and eye tracking sensors. The surgeon console 130 further includes master/input controls 132 mounted on sides of the base structure in front of the operator seat. The master/input control manipulators 132 include primary arms, secondary arms, and tertiary arms connected by joints. The joints include force feedback actuators and position sensors. The master/input control manipulators 132 terminate in ergonomic hand grips. The hand grips contain pressure sensors and multi-function triggers. In some implementations, the surgeon console 130 further includes foot pedals mounted on a lower portion of the base structure. The foot pedals include position sensors and tactile feedback mechanisms. A user interface comprising touchscreens mounts on the base structure between the master/input control manipulators 132. The touchscreens display system status information and configuration controls.
FIG. 2 is a diagram illustrating a surgeon master console, in accordance with an embodiment of the present disclosure. With reference to FIG. 2, there is shown a robotic surgical system surgeon master console (200) comprising a display unit (202) and an input control (204). The surgeon master console further comprises a master control (206) along with other computational resources (not shown) to enable smooth operation of the robotic surgical system. In accordance with an embodiment of the invention, the master control is generally an industrial PC or a single-board PC.
In operation, a surgeon performs a surgery using the input control (204) to maneuver a surgical instrument in a surgical workplace. The display unit (202) provides real-time feedback and data to the surgeon on the movement of the surgical instrument, facilitating informed decision-making during the surgical process and thereby enabling the surgeon to accurately and safely perform surgical procedures.
FIG. 3A is a diagram illustrating a surgical unit, in accordance with an embodiment of the present disclosure. With reference to FIG. 3A, there is shown a surgical unit comprising one or more surgical instrument arms mounted at a first end of the surgical unit. In an embodiment, it is illustrated that only one of the one or more surgical instrument arms is connected to the surgical unit at a first end while the rest of the surgical instrument arms are interconnected using an appropriate linking mechanism known in the art. In another embodiment, a single surgical arm can be connected to the surgical unit at a first end to perform the same functions of the one or more surgical instrument arms, without deviating from the scope of the invention as claimed. In an embodiment, an endoscope (310) comprising a camera (308) is connected to the surgical instrument arm at an interface (306). In another embodiment, one or more surgical units can be provided with a surgical instrument arm mounted at a first end. The surgical instrument tool comprising a tool tip (not shown) is connected to the surgical instrument arm at an interface using a sterile adaptor. In an exemplary implementation, there are four surgical units with surgical instrument arms, in which three arms are connected at the distal end to a surgical instrument tool comprising a tool tip for performing the surgical procedure and one arm is connected to an endoscope comprising a camera for endoscopic imaging and providing real-time feedback to the surgeon on his movements of the input control.
The surgeon master console (200) and the surgical unit (300) are connected through a communication network. The communication network comprises fiber optic cables, ethernet cables or wireless network, for high-speed data transmission. The communication network includes redundant data pathways. The communication network transmits control signals from the surgeon master console to the surgical unit. The control signals include control commands. The communication network transmits imaging data from the camera of the surgical unit to the display unit of the surgeon master console. The imaging data includes real-time images providing visualization of the surgical field/workspace.
FIG. 3B and FIG. 3C are diagrams illustrating an exploded view of a surgical unit comprising a surgical instrument tool tip and a camera respectively, in accordance with an embodiment of the present disclosure, with reference to FIG. 3B, the distal end of the surgical instrument tool (320) is shown comprising a surgical instrument tool tip (318). The proximal end of the surgical instrument tool is connected to the surgical arm at an interface (316) using a sterile adapter. An exploded view (DETAIL A) of the instrument tool tip is shown that depicts the surgical instrument tool comprising the tool tip at the distal end.
With reference to FIG. 3C, an endoscope (310) is shown comprising a camera at the distal end. The proximal end is connected to the surgical arm at an interface (306). It should be noted that the interface in FIGs. 3B and 3C is different as is evident from the drawings and the numbering (306, 316). The figures are indicative that, the shape, size and connection mechanism of the interface varies with the type of instrument (endoscope/surgical instrument) being connected at the interfaces (306) and (316). An exploded view (DETAIL B) of the instrument tool tip is shown that depicts the camera. In an embodiment, the surgical workspace is provided with one or more surgical units (300). At least one of the surgical units (300) comprises a camera (308) as depicted in exploded view (DETAIL B) of FIG. 3C.
FIG. 4 is a diagram illustrating the master controller frame and the display frame, in accordance with an embodiment of the present disclosure. With reference to FIG. 4, a surgeon master console (400) is depicted with 3D Cartesian coordinate systems representing the master controller frame (402) and the display unit frame (404). As depicted, the master controller frame (402) is established with reference to the input control of the master surgeon console as the origin and can be referenced by the display unit frame FD. The master controller frame FM is defined by the XM-YM-ZM axes. The axes of the master controller frame FM are related to the input control and the display unit, with the XM axis being towards and away from the display unit, the YM axis being horizontal or transverse to the display unit, e.g., left/right relative to the display unit, and the ZM axis being vertical relative to the floor. The display unit frame FD is defined by the XD-YD-ZD axes. The display unit frame can be referenced by the master frame FM and the camera frame FC. The axes of the display frame FD are related to the input control and the display unit, with the ZD axis being towards and away from the input control, the XD axis being horizontal or transverse to the display unit, e.g., left/right relative to the display unit, and the YD axis being vertical relative to the floor. The YD axis of the display unit frame represents a tilt in the display unit by an angle θD. This angular tilt is represented by a rotation matrix Ry(θD). By incorporating this tilt value in the display rotation matrix, the system aligns the display to the optimal viewing angle, tailored to the surgeon’s perspective. This ensures ergonomic comfort, reduces strain during prolonged use, and improves the overall usability of the system, thereby enhancing the clinician’s interaction with the robotic console.
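The display tilt matrix Ry(θD) mentioned above has the standard form of a rotation about the y-axis. A minimal sketch follows; the 15° tilt angle is a hypothetical value chosen only for illustration, and NumPy is assumed:

```python
import numpy as np

def rot_y(theta: float) -> np.ndarray:
    """Rotation matrix for a tilt of theta radians about the y-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

theta_d = np.deg2rad(15.0)   # hypothetical display tilt angle
R_display = rot_y(theta_d)
# A proper rotation matrix has determinant 1 and its transpose as its inverse,
# which is why the transposes can be used for the reverse transformations.
```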
FIG. 5 is a diagram illustrating the camera frame and the world frame, in accordance with an embodiment of the present disclosure. With reference to FIG. 5, a surgical unit (500) is depicted with 3D Cartesian coordinate systems representing the world frame (502) and the camera frame (504). The world frame FW is a fixed frame that is fixed during a surgical procedure and is defined by the Xw-Yw-Zw axes. The world frame FW is a frame positioned on the floor or ground of a surgical workplace and can be referenced by the camera frame FC and the instrument arm frame FI. As shown, the world frame FW has the Xw axis being defined in a horizontal direction parallel to the floor, the Yw axis being defined in a horizontal direction parallel to the floor, and the Zw axis being defined in a height direction from the floor to the ceiling. The camera frame FC is a frame of the camera to define a view of the camera relative to the surgical workplace and is defined by the Xc-Yc-Zc axes. The Xc axis and the Yc axis are perpendicular to one another and are each parallel to a plane defined by a lens of the camera. The Zc axis is orthogonal to the plane defined by the lens of the camera and is substantially directed towards or farther from the surgical workplace such that an object distance of the camera is defined along the Zc axis. The camera frame can be referenced by the display unit frame FD and the world frame FW.
While referencing the display frame, if the orientation of the camera frame and display frame are aligned, then no flipping is required and movements captured in the display frame are directly mapped to the camera frame without any changes. However, if the orientation of the display frame and camera frame are in opposite directions, such that the camera’s view is mirrored or inverted relative to the display, then the display unit frame is flipped 180° about the ZD axis. By compensating for this orientation mismatch, the system ensures that the surgeon’s movements on the display seamlessly correspond to the camera’s perspective, providing intuitive and accurate real-time visual feedback.
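A 180° flip about the z-axis amounts to negating the x and y components while leaving z unchanged. A minimal sketch, assuming the mirrored/aligned decision is available as a boolean flag (the names here are illustrative, not from the specification):

```python
import numpy as np

# Rz(180 degrees): x and y are negated, z is unchanged
FLIP_Z_180 = np.array([[-1.0,  0.0, 0.0],
                       [ 0.0, -1.0, 0.0],
                       [ 0.0,  0.0, 1.0]])

def display_to_camera(delta_display: np.ndarray, mirrored: bool) -> np.ndarray:
    """Map a display-frame movement into the camera frame, flipping
    180 degrees about the z-axis when the views are mirrored."""
    return FLIP_Z_180 @ delta_display if mirrored else delta_display

out = display_to_camera(np.array([1.0, 2.0, 3.0]), mirrored=True)
# out == (-1, -2, 3): left/right and up/down inverted, depth preserved
```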
While referencing the world frame, when the camera frame is aligned with the world frame, no additional adjustment related to the endoscope is required. However, if the endoscope camera is at an angular offset, say θC, then an additional rotation matrix Ry(θC) is required to compensate for the angular offset between the frames and ensure correct translation between them. For example, for a 30° endoscope, θC = π/6 is employed to compensate for the endoscopic offset.
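For the 30° endoscope example, the offset compensation Ry(θC) with θC = π/6 might be sketched as below; this is an illustrative sketch, with the function and variable names being assumptions rather than the system's actual identifiers:

```python
import numpy as np

def rot_y(theta: float) -> np.ndarray:
    """Rotation about the y-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

theta_c = np.pi / 6          # 30-degree endoscope, as in the example above
R_offset = rot_y(theta_c)

# A movement along the camera's viewing (z) axis acquires an x component
# after compensating for the 30-degree endoscopic offset
delta_compensated = R_offset @ np.array([0.0, 0.0, 1.0])
```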
FIG. 6 is a diagram illustrating the world frame and the instrument arm frame, in accordance with an embodiment of the present disclosure. With reference to FIG. 6, a surgical unit (600) is depicted with 3D Cartesian coordinate systems representing the world frame FW (602) and the surgical instrument arm frame FI (604). The world frame FW is a fixed frame that is fixed during a surgical procedure and is defined by the Xw-Yw-Zw axes as described with reference to FIG. 5. The surgical instrument arm frame is a frame of the instrument arm established at an interconnection between the surgical instrument arm and the surgical unit and is defined by the XI-YI-ZI axes. The surgical instrument arm frame can be referenced by the world frame FW and the instrument tip frame FT. The XI axis and the YI axis are perpendicular to one another and are substantially parallel to the floor. The ZI axis is perpendicular to each of the XI and YI axes and is substantially vertical towards a ceiling, e.g., away from the floor.
FIG. 7 is a diagram illustrating the instrument tip frame, in accordance with an embodiment of the present disclosure. With reference to FIG. 7, a surgical instrument tool (700) is depicted with a tool tip and a 3D Cartesian coordinate system representing the instrument tip frame FT (702). The instrument tip frame is established at the interconnection between the surgical instrument tool and the tool tip (704). The instrument tip frame FT references the instrument arm frame FI and is defined by the XT-YT-ZT axes. The XT axis and the YT axis are perpendicular to one another and are substantially parallel to the floor. The ZT axis is perpendicular to each of the XT and YT axes and is substantially vertical towards a ceiling, e.g., away from the floor. The new position of the tool tip in the instrument arm frame FI is calculated by adding an incremental movement vector to the position of the instrument tool in the instrument arm frame prior to the movement.
In accordance with an embodiment, the surgical instrument tool is coupled to the surgical instrument arm using a sterile adaptor. The term "sterile adaptor" refers to a device that facilitates the attachment of surgical instruments to the robotic system while maintaining a sterile environment to prevent contamination during surgery. The surgical instrument tool is securely attached to the surgical instrument arm through a sterile adaptor, ensuring a clean and safe interface during procedures. This coupling allows for seamless integration of the tool with the robotic system, facilitating precise control over the instrument's movements.
In some implementations, the robotic surgical system executes autonomous and semi-autonomous functions. In some implementations, the robotic surgical system 100 enables system upgrades through modular component replacement. The modular component replacement includes instrument interface upgrades and processing unit upgrades.
The robotic surgical system enables minimally invasive surgical procedures. Exemplary surgical procedures may include, but not limited to, general surgery procedures, gynaecological procedures, urological procedures, cardiothoracic procedures, and otolaryngological procedures.
Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as "including", "comprising", "incorporating", "have", "is" used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments. The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". It is appreciated that certain features of the present disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the present disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable combination or as suitable in any other described embodiment of the disclosure.
Claims:CLAIMS
We Claim:
1. A robotic surgical system comprising:
a surgeon master console comprising an input control and a display unit;
a surgical unit comprising one or more surgical instrument arms mounted at a first end to the surgical unit, wherein the one or more surgical arms comprises an endoscope or a surgical instrument tool, wherein the surgical instrument tool comprises a tool tip and the endoscope comprises a camera; and
a master controller, wherein the master controller is configured to:
receive movements of the input control in a master controller frame established with reference to the input control;
translate the movements of the input control in the master controller frame to movements of the surgical instrument tool in a display unit frame established with reference to the display unit;
translate the movements captured in the display unit frame to movements in a camera frame established with reference to the camera of the endoscope;
translate the movements in each of the camera frame to movements in a world frame established with reference to a surgical workspace;
translate the movements in the world frame to movements in a surgical instrument arm frame established at the first end of the one or more surgical instrument arms;
calculate a new position of the tool tip in the instrument arm frame; and
transmit control commands to one or more motors controlling the surgical instrument tool to move the tool tip to the new position.

2. The robotic surgical system of claim 1, wherein a display rotation matrix is applied to translate the movements of the input control in the master controller frame to movements of instrument tool in a display unit frame.
3. The robotic surgical system of claim 2, wherein the display rotation matrix represents a tilt of the display unit frame about y-axis by an angle θD.

4. The robotic surgical system of claim 1, wherein a camera rotation matrix is applied to the movements captured in the display unit frame to movements in a camera frame.

5. The robotic surgical system of claim 4, wherein the camera rotation matrix is flipped 180° about the z-axis when the orientation of the camera and the display unit are in opposite directions.

6. The robotic surgical system of claim 5, wherein the camera rotation matrix is flipped 180° about the z-axis based on an input from a surgeon to the master controller using the input control of the surgeon master console.

7. The robotic surgical system of claim 1, wherein a world frame rotation matrix is applied to the movements in the camera frame to the movements in a world frame.

8. The robotic surgical system of claim 7, wherein the world frame rotation matrix is adjusted using an endoscope rotation matrix to account for angular offset between the camera frame and the world frame.

9. The robotic surgical system of claim 1, wherein an instrument arm rotation matrix is applied to the movements in the world frame to movements in a surgical instrument arm frame.

10. The robotic surgical system of claim 1, wherein the new position is calculated based on current position of the instrument and an incremental movement vector.

11. The robotic surgical system of claim 1, wherein there are four or more surgical instrument arms with at least one instrument arm mounted with an endoscope comprising a camera.

12. A method for controlling a surgical instrument tool of a surgical robot with a master controller, comprising:
receiving, by the master controller, movements of an input control associated with a surgeon master console;
establishing, by the master controller, the movements of the input control in a master controller frame with reference to the position of the input control;
translating, by the master controller, the movements of the input control in the master controller frame to movements of the surgical instrument tool in a display unit frame established with reference to a display unit associated with the surgeon master console;
translating, by the master controller, the movements captured in the display unit frame to movements in a camera frame established with reference to a camera associated with one or more surgical arms;
translating, by the master controller, the movements in the camera frame to movements in a world frame established with reference to a surgical workspace;
translating, by the master controller, the movements in the world frame to movements in a surgical instrument arm frame established at the first end of the one or more surgical instrument arms; wherein the surgical instrument arm is coupled to a surgical unit at the first end;
calculating, by the master controller, a new position of a tool tip in the surgical instrument arm frame; and
transmitting, by the master controller, control commands to one or more motors controlling a surgical instrument tool to move the tool tip to the new position.

13. The method of claim 12, wherein a display rotation matrix is used to translate the movements of the input control in the master controller frame to the movements of the surgical instrument tool.

14. The method of claim 13, wherein the display rotation matrix represents a tilt of the display unit frame about y-axis by an angle θD.

15. The method of claim 12, wherein a camera rotation matrix is used to translate the movements captured in the display unit frame to movements in a camera frame.

16. The method of claim 15, wherein the camera rotation matrix is flipped 180° about the z-axis when the orientation of the one or more cameras and the display unit are in opposite directions.

17. The method of claim 12, wherein a world frame rotation matrix is applied to the movements in the camera frame to the movements in a world frame.

18. The method of claim 17, wherein the world frame rotation matrix is adjusted using an endoscope rotation matrix to account for angular offset between the camera frame and the world frame.

19. The method of claim 12, wherein an instrument arm rotation matrix is applied to the movements in the world frame to movements in a surgical instrument arm frame.

20. The method of claim 12, wherein the new position is calculated based on current position of the instrument and incremental movement vector.
