Abstract: The application provides a mixed reality surgeon console system (106) for a multi-arm robotic surgical system (100). The surgeon console system (106) comprises a chair (112) on which a surgeon (110) can be seated; a hand gripper assembly (120) secured to the chair (112), the hand gripper assembly (120) including a left-hand gripper (120a) and a right-hand gripper (120b) with a pair of sensors (116) attached to each; a headset holder (124) connected to the chair (112) and configured to secure a mixed reality headset (128); a foot pedal tray (122) connected to the chair (112) and configured to move in inward and outward directions; a control unit (132); and a transmitter (118) coupled to the pair of sensors (116), the transmitter (118) configured to send surgeon input to the robotic surgical instruments. Figure 2
TECHNICAL FIELD
[0001] The present disclosure generally relates to the field of immersive technology applications in medical devices, and more particularly, the disclosure relates to a mixed reality surgeon console chair for a robotic surgery environment in medical applications.
BACKGROUND
[0002] This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present disclosure, which are described below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
[0003] Robotically assisted surgical systems have been adopted worldwide to replace conventional surgical procedures and to reduce the amount of extraneous tissue that may be damaged during surgical or diagnostic procedures, thereby reducing patient recovery time, patient discomfort, prolonged hospital stays, and deleterious side effects. In robotically assisted surgeries, the surgeon typically operates a hand controller/ master controller/ surgeon input device at a surgeon console to seamlessly capture and transfer complex actions performed by the surgeon, giving the perception that the surgeon is directly articulating the surgical tools/ surgical instruments to perform the surgery. The surgeon operating the surgeon console may be located at a distance from the surgical site or may be located within the operating theatre where the patient is being operated upon.
[0004] The robotically assisted surgeries have revolutionized the medical field and are one of the fastest growing sectors in the medical device industry. One of the key areas of robotically assisted surgeries is the development of surgical robots for minimally invasive surgery. Over the last couple of decades, surgical robots have evolved exponentially and have been a major area of innovation in the medical device industry.
[0005] The robotically assisted surgical systems may comprise multiple robotic arms that aid in conducting robotically assisted surgeries. The surgeon controls the robotic arm and the instruments mounted on it by using the surgeon console. The surgeon console comprises a visualization system to allow the surgeon to perform the surgery. Further, the hand controllers/ master controllers/ surgeon input devices are integrated with the surgeon console, which the surgeon maneuvers to perform the surgery.
[0006] Performing surgery at the surgeon console creates new challenges. One challenge is the ergonomics of the console, which must allow the surgeon to perform the surgery for a long duration without fatigue. During surgery, the surgeon is often required to remain seated in an exhausting posture for many hours due to the hardware structure of such surgeon consoles. Another challenge is that the current iteration of surgeon consoles comprises a large assemblage of hardware, drives, motors, and electronics, and thus constitutes a substantial component of any surgical robotics system. This limits its adaptability, thereby necessitating a market stimulus that would encourage surgeons and physicians to procure the whole surgical robotics system for use with the console. Also, the large assemblage size of the existing surgeon console reduces its portability. Further, another challenge of the current surgeon console is the unavailability of simultaneous visualization of all the patient-related digital data. Moreover, if existing virtual reality headsets with straps mounted on the surgeon's head are used in robotic surgery, they cause fatigue to the surgeon after long hours of use. Furthermore, the existing surgeon consoles used in robotic surgeries are difficult to utilize in tele-surgery.
[0007] In light of the aforementioned challenges, there is a need for an improved surgeon console for use in a multi-arm robotic surgical system that solves the above-mentioned problems related to robotically assisted surgeries.
SUMMARY OF THE DISCLOSURE
[0008] Some or all of the above-mentioned problems related to surgeon consoles in a multi-arm robotic surgical system are proposed to be addressed by certain embodiments of the present disclosure.
[0009] In an aspect, an embodiment of the present disclosure provides a mixed reality surgeon console system for a multi-arm robotic surgical system comprising one or more robotic arms, where one of the robotic arms is coupled to an endoscopic camera and each of the remaining robotic arms is coupled to a robotic surgical instrument at its distal end, the surgeon console system comprising: a chair on which a surgeon can be seated; a hand gripper assembly secured to the chair, the hand gripper assembly including a left-hand gripper and a right-hand gripper, each hand gripper provided with a pair of sensors; a control unit configured to enable two-way data communication between the robotic surgical instruments connected to the robotic arms and the surgeon console using a transmitter coupled to the pair of sensors; a headset holder connected to the chair, the headset holder configured to secure a mixed reality headset, the mixed reality headset configured to receive holographic UI elements from the control unit and data from a plurality of sensors; and a foot pedal tray connected to the chair, the foot pedal tray configured to move in inward and outward directions.
[00010] In another aspect, an embodiment of the present disclosure provides a mixed reality surgeon console for a multi-arm robotic surgical system comprising one or more robotic arms, where one of the robotic arms is coupled to an endoscopic camera and each of the remaining robotic arms is coupled to a robotic surgical instrument at its distal end, the surgeon console comprising: a housing mounted at one end of a stand; a plurality of slots positioned at the front side of the housing; a plurality of hand controllers connected to the slots respectively, each of the hand controllers comprising a handshake gripper, a distal end ball joint operationally secured to the housing at one end, a proximal end ball joint operationally secured to the handshake gripper at one end, and a telescopic linkage connecting the other ends of the distal end ball joint and the proximal end ball joint; a foot pedals assembly mounted at the other end of the stand; a control unit configured to enable two-way communication between the hand controllers and the robotic arms by using a data transceiver; and a mixed reality headset connected to the housing, the mixed reality headset configured to receive holographic UI elements from the control unit and data from a plurality of sensors, such as a haptic touch of the handshake gripper, wherein the hand controllers are configured to provide a magnetic brake to simulate haptic touch at the distal end ball joint and a linear telescopic mechanism.
[00011] Optionally, a set of mechanical links is mounted on sides of the chair to provide support and motion calibration to electromagnetic wave-based sensors attached to grippers.
[00012] Optionally, the transmitter may be secured anywhere on the chair.
[00013] Optionally, the foot pedal tray can be extended for use of foot pedal during surgery or can be folded into the chair for stowing when not in use.
[00014] Optionally, the foot pedal tray provides wireless and wired universal serial bus (USB) communication for more portable foot-based control access for cautery, camera toggle, arm toggle and the like.
[00015] Optionally, the mixed reality headset may be any immersive headset, such as a virtual reality headset, an augmented reality headset, and the like.
[00016] Optionally, the mixed reality headset is capable of operating in both wired and wireless modes depending on the fidelity of a signal.
[00017] Optionally, the mixed reality headset may include a camera, a microphone, a motion sensor, and a gaze tracker and the like.
[00018] Optionally, the mixed reality headset can project three-dimensional (3D) vision of an endoscope.
[00019] Optionally, the left-hand gripper and right-hand gripper contain electromagnetic wave-based sensors configured to translate position and orientation of sensors to a frame of the tool tip of robotic surgical instruments connected to the robotic arms.
[00020] Optionally, the holographic UI elements can be used to provide the surgeon with cautery power settings and toggle, endoscope toggle and control, robotic arm toggling, 3D DICOM over a virtual patient, real-time miniature virtual diorama of an operating room (OR) setup, live holographic surgeons for tele-mentoring, virtual AI assistant for intra-operative guidance with patient-specific medical data, active eye tracking of surgeons to read fatigue levels and distraction, and tangible digital twins of surgical equipment and surgical robotic system components for pre-operative practice.
[00021] Optionally, the surgeon can interact with the holographic UI elements by using hand gestures, voice commands, eye movements, and head movements.
[00022] Optionally, the 3D DICOM provides the surgeon with a virtual representation of the patient for viewing the patient's anatomy and vital signs in real-time.
[00023] Optionally, the hand controllers contain passive sensors configured to translate position and orientation of sensors to a frame of the tool tip of robotic surgical instruments connected to the robotic arms.
[00024] Optionally, the linear telescopic mechanism would be used to actuate a translational motion of the handshake gripper of each of the hand controllers.
[00025] Optionally, the distal end ball joint provides haptic feedback for Roll, Pitch, and Yaw motions of the handshake gripper.
[00026] Optionally, the telescopic linkage provides haptic feedback for in and out motions of the handshake gripper.
[00027] Optionally, the proximal end ball joint provides haptic feedback for 1:1 hand motion translation of the robotic surgical instruments connected to the robotic arms.
[00028] Other embodiments, systems, methods, apparatus aspects, and features of the invention will become apparent to those skilled in the art from the following detailed description, the accompanying drawings, and the appended claims. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[00029] The summary above, as well as the following detailed description of the disclosure, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to the specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
FIG. 1 illustrates an example implementation of a multi arm teleoperated surgical system which can be used with one or more features in accordance with an embodiment of the disclosure;
FIG. 2 illustrates an example implementation of a mixed reality surgeon console for robotic surgery in a multi-arm teleoperated surgical system in accordance with an embodiment of the disclosure;
FIG. 3 illustrates a surgeon using the mixed reality surgeon console in accordance with an embodiment of the disclosure;
FIG. 4 illustrates passive 7 DOF mechanical links in accordance with an embodiment of the disclosure;
FIG. 5 illustrates a foldable foot pedal in accordance with an embodiment of the disclosure;
FIG. 6 illustrates a headset of the mixed reality surgeon console in accordance with an embodiment of the disclosure;
FIG. 7 illustrates a virtual surgeon console in accordance with an embodiment of the disclosure; and
FIG. 8 illustrates a six degree of freedom (6DoF) sensor based passive hand controller in accordance with an embodiment of the disclosure.
DETAILED DESCRIPTION OF THE DISCLOSURE
[00030] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the disclosure relates.
[00031] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof. Throughout the patent specification, a convention employed is that in the appended drawings, like numerals denote like components.
[00032] Reference throughout this specification to “an embodiment”, “another embodiment”, “an implementation”, “another implementation” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase “in an embodiment”, “in another embodiment”, “in one implementation”, “in another implementation”, and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[00033] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures preceded by “comprises... a” does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or additional devices or additional sub-systems or additional elements or additional structures.
[00034] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. The device, system, and examples provided herein are illustrative only and not intended to be limiting.
[00035] The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Further, the terms sterile barrier and sterile adapter denote the same meaning and may be used interchangeably throughout the description.
[00036] Embodiments of the disclosure will be described below in detail with reference to the accompanying drawings.
[00037] Figure 1 illustrates an example implementation of a multi arm teleoperated surgical system which can be used with one or more features in accordance with an embodiment of the disclosure. Specifically, figure 1 illustrates the multi arm teleoperated surgical system (100) having five robotic arms (102a), (102b), (102c), (102d), (102e) mounted on five robotic arm carts around an operating table (104). The five robotic arms (102a), (102b), (102c), (102d), (102e), as depicted in figure 1, are for illustration purposes, and the number of robotic arms may vary depending upon the type of surgery. The exemplary five robotic arms (102a), (102b), (102c), (102d), (102e) are arranged along the operating table (104), although other arrangements are also possible. The robotic arms (102a), (102b), (102c), (102d), (102e) may be separately mounted on the five robotic arm carts, may be mechanically and/or operationally connected with each other, or may be connected to a central body (not shown) such that the robotic arms branch out of the central body. Further, the multi arm teleoperated surgical system (100) includes a surgeon console/ master controller (106), a vision cart (108), and a surgical instrument and accessory table. The mixed reality surgeon console (106) can be utilized in tele-surgery.
[00038] Figure 2 illustrates an example implementation of a mixed reality surgeon console for robotic surgery in a multi-arm teleoperated surgical system in accordance with an embodiment of the disclosure. The mixed reality surgeon console (106) utilizes a portable chair to accommodate a surgeon during a robotic surgical procedure. The essential components for controlling the surgical robots are integrated in the surgeon console to ensure a compact and user-friendly design. This surgeon console (106) serves as the interface for a surgeon (110) within the operating theatre. The surgeon console (106) comprises a chair (112) for the surgeon (110) and a set of mechanical links (114) mounted on the sides of the chair (112). The mechanical links provide support and motion calibration to the electromagnetic wave-based sensors (116) attached to the grippers (120). The mixed reality console (106) further includes a transmitter (118) attached to the bottom of the chair (112) and coupled to the electromagnetic wave-based sensors (116). The transmitter (118) enables precise control of the robotic instruments through electromagnetic wave communication.
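By way of illustration only, and not as the disclosed implementation, the following sketch shows one way a gripper-mounted electromagnetic sensor pose reported in the transmitter (118) frame could be mapped to the tool-tip frame of a robotic surgical instrument; the calibration transform, quaternion convention, and motion-scaling factor are assumptions.

```python
# A minimal sketch (not the disclosed implementation) of mapping an
# electromagnetic tracking sensor pose on a hand gripper, reported in the
# transmitter frame, into the tool-tip frame of a robotic surgical
# instrument. The calibration transform and scaling factor are assumptions.
import numpy as np
from scipy.spatial.transform import Rotation as R

MOTION_SCALE = 0.4  # assumed master-to-instrument motion scaling

def gripper_pose_to_tooltip(sensor_pos, sensor_quat, calib_rot, calib_trans):
    """Map a sensor pose (transmitter frame) into the instrument tool-tip frame.

    sensor_pos  : (3,) sensor position in metres
    sensor_quat : (4,) sensor orientation as an (x, y, z, w) quaternion
    calib_rot   : scipy Rotation aligning the transmitter frame to the tool frame
    calib_trans : (3,) translation of that calibration transform
    """
    tool_pos = calib_rot.apply(np.asarray(sensor_pos)) + np.asarray(calib_trans)
    tool_rot = calib_rot * R.from_quat(sensor_quat)
    # Scale translations so large hand motions map to fine instrument motions.
    return MOTION_SCALE * tool_pos, tool_rot.as_quat()

# Example: identity calibration, sensor 10 cm in front of the transmitter.
pos, quat = gripper_pose_to_tooltip([0.10, 0.0, 0.0], [0.0, 0.0, 0.0, 1.0],
                                    R.identity(), [0.0, 0.0, 0.0])
```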
[00039] Figure 3 illustrates a surgeon using the mixed reality surgeon console in accordance with an embodiment of the disclosure. The surgeon (110) can sit comfortably on the chair (112) provided with the surgeon console (106).
[00040] Figure 4 illustrates passive spring-based 7 DOF mechanical links in accordance with an embodiment of the disclosure. The mechanical links (114) are attached to the arms (134) of the chair (112).
[00041] Figure 5 illustrates a foldable foot pedal in accordance with an embodiment of the disclosure. A retractable foot pedal tray (122) is extendible for use of foot pedal during surgery. The foot pedal tray (122) can be folded into the chair (112) for convenient stowing when not in use.
[00042] Figure 6 illustrates a headset of the mixed reality surgeon console in accordance with an embodiment of the disclosure. A mixed reality headset holder (124) is provided. The mechanical links (126) are attached to the back of the chair (112) to hold a headset (128). A microphone (136) and a speaker (138) are provided on the headset (128). Further, data from a plurality of sensors, such as a camera, a microphone (136), a speaker (138), a motion sensor, and a gaze tracker can be received by the mixed reality headset (128). The data from the sensors is received and analyzed in order to determine the actions of the user. The actions of the user can then be used to interact with the holographic UI elements. For example, the user can interact with the holographic UI elements by using hand gestures, voice commands, eye movements, and head movements. The surgeon (110) gets an augmented environment with a pass-through or see-through feature for easy vision of the operating room.
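As a hedged illustration of this interaction model (not the disclosed implementation), the sketch below routes analysed headset sensor events such as hand gestures, voice commands, gaze, and head movements to holographic UI actions; the event names and bindings are hypothetical.

```python
# A hypothetical sketch of routing headset sensor events (hand gesture,
# voice command, gaze, head motion) to holographic UI actions. Event names
# and bindings are illustrative, not the disclosed implementation.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class SensorEvent:
    source: str   # "gesture" | "voice" | "gaze" | "head"
    name: str     # e.g. "pinch", "toggle camera", "dwell", "nod"

class HolographicUI:
    def __init__(self) -> None:
        self._handlers: Dict[Tuple[str, str], Callable[[], None]] = {}

    def bind(self, source: str, name: str, handler: Callable[[], None]) -> None:
        """Register a UI action for a (sensor source, recognised input) pair."""
        self._handlers[(source, name)] = handler

    def dispatch(self, event: SensorEvent) -> None:
        """Invoke the bound action, if any, for an analysed sensor event."""
        handler = self._handlers.get((event.source, event.name))
        if handler is not None:
            handler()

ui = HolographicUI()
ui.bind("voice", "toggle camera", lambda: print("endoscope view toggled"))
ui.bind("gesture", "pinch", lambda: print("holographic panel selected"))
ui.dispatch(SensorEvent("voice", "toggle camera"))
```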
[00043] The headset (128) features a virtual screen for viewing the endoscopic feed from the endoscope, a virtual control screen for manipulating the chair ergonomics, a panel for patient details, a 3D DICOM, a virtual screen for patient vitals monitoring, a virtual screen for remote proctoring access, and a virtual screen for robotic system controls. The 3D DICOM provides the user with a virtual representation of the patient so that the user can view the patient's anatomy and vital signs in real-time. Additionally, the 3D DICOM can be used to track and monitor changes to the patient's anatomy and vital signs throughout the procedure. The holographic UI elements can be used to provide the user with cautery power settings and toggle, endoscope toggle and control, robotic arm toggling, 3D DICOM over a virtual patient, a real-time miniature virtual diorama of an operating room (OR) setup, live holographic surgeons for tele-mentoring, a virtual AI assistant for intra-operative guidance with patient-specific medical data, active eye tracking of surgeons to read fatigue levels and distraction, and tangible digital twins of surgical equipment and surgical robotic system components for pre-operative practice. The real-time miniature virtual diorama provides the user with a virtual representation of the OR setup so that the user can view the layout of the OR and the placement of the surgical equipment.
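For illustration only, the following sketch (assuming the pydicom library) shows one way a CT or MR series could be assembled into a 3D volume for the virtual patient view, alongside a formatted string for the patient vitals monitoring screen; the directory layout and vitals values are hypothetical, not part of the disclosure.

```python
# A minimal sketch, assuming the pydicom library, of assembling a CT/MR
# series into a 3D volume that a "virtual patient" view could render, plus
# a formatted string for the patient vitals monitoring screen. The
# directory layout and vitals values are hypothetical.
import glob
import numpy as np
import pydicom

def load_dicom_volume(series_dir: str) -> np.ndarray:
    """Stack a DICOM series into a (slices, rows, cols) voxel array."""
    datasets = [pydicom.dcmread(path) for path in glob.glob(f"{series_dir}/*.dcm")]
    datasets.sort(key=lambda ds: int(ds.InstanceNumber))   # order the slices
    return np.stack([ds.pixel_array for ds in datasets])

def vitals_overlay(heart_rate: int, spo2: int, nibp: str) -> str:
    """Format a real-time vitals line for display beside the 3D DICOM."""
    return f"HR {heart_rate} bpm | SpO2 {spo2}% | NIBP {nibp} mmHg"

# volume = load_dicom_volume("/data/patient_ct")   # hypothetical series path
print(vitals_overlay(72, 98, "118/76"))
```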
[00044] Additionally, the virtual diorama can be used to track and monitor changes to the OR setup in real-time. The holographic UI elements can be used to provide the user with a live holographic surgeon for tele-mentoring. The live holographic surgeon provides the user with a virtual representation of a surgeon who can guide the user through the steps of the procedure and answer questions as they arise. The holographic UI elements can be used to provide the user with a virtual AI assistant for intra-operative guidance with patient-specific medical data. The virtual AI assistant can provide the user with real-time guidance based on the patient's medical data, such as vital signs, laboratory results, and diagnostics. Additionally, the AI assistant can be used to track and monitor changes to the patient's medical data in real-time. The holographic UI elements can be used to provide the user with active eye tracking of surgeons to read fatigue levels and distraction. The active eye tracking can monitor the user's eye movements in order to determine if the user is becoming fatigued or distracted during the procedure. The holographic UI elements can be used to provide the user with tangible digital twins of surgical equipment and surgical robotic system components for pre-operative practice. The tangible digital twins can provide the user with a virtual representation of the surgical equipment and robotic system components so that the user can practice using the equipment and components before performing the procedure.
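As a hypothetical example of how active eye tracking might be reduced to a fatigue or distraction estimate, the sketch below monitors blink rate and gaze dispersion over a sliding window; the window length and thresholds are illustrative assumptions, not values from the disclosure.

```python
# A hypothetical sketch of reducing active eye tracking to a fatigue or
# distraction estimate: blink count and gaze dispersion over a sliding
# window. Window length and thresholds are illustrative assumptions.
from collections import deque
from statistics import pstdev

class FatigueMonitor:
    def __init__(self, window: int = 300, blink_limit: int = 25,
                 gaze_limit: float = 0.15) -> None:
        self.blinks = deque(maxlen=window)   # 1 if a blink occurred this sample
        self.gaze_x = deque(maxlen=window)   # normalised gaze coordinates
        self.gaze_y = deque(maxlen=window)
        self.blink_limit = blink_limit
        self.gaze_limit = gaze_limit

    def update(self, blinked: bool, gx: float, gy: float) -> str:
        self.blinks.append(1 if blinked else 0)
        self.gaze_x.append(gx)
        self.gaze_y.append(gy)
        if sum(self.blinks) > self.blink_limit:
            return "fatigue suspected"       # elevated blink rate in the window
        if len(self.gaze_x) > 2 and max(pstdev(self.gaze_x),
                                        pstdev(self.gaze_y)) > self.gaze_limit:
            return "distraction suspected"   # gaze wandering off the feed
        return "nominal"
```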
[00045] The headset (128) includes a provision for 3D notifications using the holographic UI for important troubleshooting and surgery status data. A detachable console box (130) is placed under the chair (112). The console box (130) houses a control unit (132) containing all the processors. The control unit (132) enables two-way data communication between the robotic surgical instruments connected to the robotic arms (102a), (102b), (102c), (102d), (102e) and the surgeon console (106), thus ensuring a streamlined and organized setup.
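The sketch below is a minimal, hypothetical illustration of such a two-way data path: a command frame is sent from the console toward a robotic arm and an instrument-state frame is polled in return; the transport, address, and message fields are assumptions rather than the disclosed protocol.

```python
# A minimal, hypothetical sketch of a two-way data path between the console
# and one robotic arm: a command frame is sent out and an instrument-state
# frame is polled in return. The transport (UDP), address, and JSON message
# shape are assumptions, not the disclosed protocol.
import json
import socket
from typing import Optional

ARM_ADDRESS = ("192.168.10.20", 5005)   # assumed robotic arm endpoint

def exchange(command: dict, timeout: float = 0.01) -> Optional[dict]:
    """Send one command frame and poll once for a returned state frame."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(json.dumps(command).encode(), ARM_ADDRESS)
        try:
            payload, _ = sock.recvfrom(4096)
            return json.loads(payload)       # e.g. joint angles, tool state
        except socket.timeout:
            return None                      # no state frame this cycle

# state = exchange({"tooltip_pos": [0.01, 0.0, 0.02], "grip": 0.4})
```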
[00047] The surgeon (110) sits upright on the chair (112) and leans back, with the mixed reality headset (128) positioned for optimal viewing. The passive spring-based mechanical links (114) provide support and calibration for the electromagnetic wave sensors (116) attached to the grippers (120). This enables precise control of the robotic instruments. The retractable foot pedal tray (122) can be utilized during surgery and conveniently stowed when not needed. The mixed reality headset (128) with adjustable mechanical links (126) ensures easy manipulation for the surgeon’s comfort. The processor and the control unit of the console box (130) under the chair facilitate efficient power and data communication.
[00048] Figure 7 illustrates an exemplary variation of a mixed reality surgeon console in accordance with an embodiment of the disclosure. The virtual surgeon console (200) includes a housing (201) and a plurality of slots (203a), (203b) positioned at the front side of the housing (201). A plurality of hand controllers (205a), (205b) is operationally connected to the slots (203a), (203b), respectively. Further, the housing (201) is mounted on one end of a stand (207), and a foot pedals assembly (209) is mounted on the other end of the stand (207). The foot pedals assembly (209) may be a detachable foot pedal board with wireless and wired universal serial bus (USB) communication for more portable foot-based control access for cautery, camera toggle, arm toggle, and the like. An immersive headset such as a mixed reality headset (211) is connected to the housing (201) by a wired means (213). The mixed reality headset (211) may be any immersive headset, such as a virtual reality headset, an augmented reality headset, and the like. In an alternative embodiment, the connection between the mixed reality headset (211) and the housing (201) may be by wireless means.
[00049] Figure 8 illustrates a six degree of freedom (6DoF) sensor (407) based passive hand controller in accordance with an embodiment of the disclosure. The hand controller may be designed to transmit world space position and orientation data to a robotic arm. The hand controller comprises an ergonomically designed handshake gripper (409) which is equipped with sixty-degree 6DoF ball joints at both the distal and proximal ends. The distal end ball joint (401) is operationally secured to the housing (201) (as illustrated in figure 7) and the proximal end ball joint (405) is operationally secured to the handshake gripper (409). Furthermore, a provision for a magnetic brake to simulate haptic touch at the distal end ball joint (401) is provided. In addition, the system is equipped with a linear telescoping mechanism/ parallel linkage mechanism (403) that is used to actuate translational motion of the handshake gripper (409) of each of the hand controllers (205a), (205b).
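Purely as a hedged sketch of the kind of data such a passive controller could stream to the robotic arm, the example below packs a world-space position, an orientation quaternion, the handshake-gripper angle, and a clutch flag into a fixed binary frame; the field set and layout are assumptions for illustration.

```python
# A sketch, under assumed field names, of the kind of 6DoF pose frame a
# passive hand controller could stream to the robotic arm: world-space
# position, orientation quaternion, handshake-gripper angle, and a clutch
# flag. The binary layout is illustrative only.
import struct
from dataclasses import dataclass
from typing import Tuple

PACKET_FORMAT = "<3f4ff?"   # px,py,pz | qx,qy,qz,qw | grip angle | clutch

@dataclass
class ControllerPacket:
    position: Tuple[float, float, float]            # metres, world space
    quaternion: Tuple[float, float, float, float]   # (x, y, z, w)
    grip_angle: float                               # gripper opening, radians
    clutched: bool                                  # True while decoupled from the arm

    def pack(self) -> bytes:
        return struct.pack(PACKET_FORMAT, *self.position, *self.quaternion,
                           self.grip_angle, self.clutched)

pkt = ControllerPacket((0.10, -0.02, 0.35), (0.0, 0.0, 0.0, 1.0), 0.6, False)
assert len(pkt.pack()) == struct.calcsize(PACKET_FORMAT)
```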
[00050] The hand controller is designed to be easy to use, highly accurate and reliable, and comfortable to handle. The hand controller system is also designed to be lightweight and portable for the convenience of surgeons. The housing (201) (as illustrated in figure 7) includes all the electronics, as well as the magnetic brakes and encoders at the distal end ball joint (401) to simulate haptics, and a storage provision for the mixed reality headset (211). A control unit (215) of an industrial computer (with EtherCAT modules and high-speed data communication modules) is configured to receive a 12G-SDI video signal from the endoscope and to enable two-way communication between the hand controllers (205a), (205b) and the robotic arms (102a), (102b), (102c), (102d), (102e) by using a data transceiver (217). The distal end ball joint (401) and the proximal end ball joint (405) can move within a selected permissible range. The distal end ball joint (401) provides haptic feedback for Roll, Pitch, and Yaw motions. The telescopic linkage (403) provides haptic feedback for in and out motions. The proximal end ball joint (405) provides haptic feedback for 1:1 hand motion translation.
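The following hypothetical sketch illustrates one way reflected instrument loads could be distributed across those haptic channels: rotational torques mapped to the distal ball joint's roll, pitch, and yaw brakes, and axial force mapped to the telescopic linkage; the gains and clipping are assumptions, not disclosed parameters.

```python
# A hypothetical sketch of distributing haptic cues across the controller:
# reflected torques mapped to the distal ball joint's roll/pitch/yaw brakes
# and axial force mapped to the telescopic linkage, each clipped to a
# magnetic-brake duty in [0, 1]. The gains are illustrative assumptions.
from typing import Dict

ROT_GAIN = 0.8     # assumed scaling from reflected torque (N*m) to brake duty
AXIAL_GAIN = 0.5   # assumed scaling from reflected axial force (N) to brake duty

def clamp(value: float) -> float:
    return max(0.0, min(1.0, value))

def haptic_commands(torque_rpy: Dict[str, float], axial_force: float) -> Dict[str, float]:
    """Convert reflected instrument loads into per-axis brake duty commands."""
    cmds = {axis: clamp(ROT_GAIN * abs(t)) for axis, t in torque_rpy.items()}
    cmds["telescopic"] = clamp(AXIAL_GAIN * abs(axial_force))
    return cmds

print(haptic_commands({"roll": 0.2, "pitch": 0.9, "yaw": 0.1}, axial_force=1.4))
```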
[00051] The present disclosure has the following advantages. The surgeon console of the present disclosure is portable and has an ergonomic design. Further, the surgeon's comfort and control during robotic surgery are enhanced. The integration of the essential components is compact. Precise instrument control is obtained through the electromagnetic wave-based sensors. Also, a mixed reality environment for comprehensive surgical monitoring and control is achieved.
[00052] The foregoing description of exemplary embodiments of the present disclosure has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, to thereby enable others skilled in the art to best utilize the disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, and the description is intended to cover the application or implementation without departing from the spirit or scope of the claims of the present disclosure.
[00053] Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.
[00054] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person in the art, various working modifications may be made to the apparatus in order to implement the inventive concept as taught herein.
CLAIMS:
1. A mixed reality surgeon console system (106) for a multi-arm robotic surgical system (100) comprising one or more robotic arms (102a), (102b), (102c), (102d), (102e), where one of the robotic arms (102a) is coupled to an endoscopic camera and each of the remaining robotic arms (102b), (102c), (102d), (102e) is coupled to a robotic surgical instrument at its distal end, the surgeon console system (106) comprising:
a chair (112) on which a surgeon (110) can be seated;
a hand gripper assembly (120) secured to the chair (112), the hand gripper assembly (120) includes a left-hand gripper (120a) and a right-hand gripper (120b), each hand gripper (120a) (120b) provided with a pair of sensors (116);
a control unit (132) configured to enable a two-way data communication between the robotic surgical instruments connected to the robotic arms (102a), (102b), (102c), (102d), (102e) and the surgeon console (106) using a transmitter (118) coupled to the pair of sensors (116);
a headset holder (124) connected to the chair (112), the headset holder (124) is configured to secure a mixed reality headset (128); the mixed reality headset (128) configured to receive holographic UI elements from the control unit (132) and data from a plurality of sensors (116); and
a foot pedal tray (122) connected to the chair (112), the foot pedal tray (122) configured to move in inward and outward directions.
2. The mixed reality surgeon console system (106) as claimed in claim 1, wherein a set of mechanical links (114) is mounted on sides of the chair (112) to provide support and motion calibration to electromagnetic wave-based sensors (116) attached to grippers (120a) (120b).
3. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the transmitter (118) may be secured anywhere on the chair (112).
4. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the foot pedal tray (122) can be extended for use of foot pedal during surgery or can be folded into the chair (112) for stowing when not in use.
5. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the foot pedal tray (122) provides wireless and wired universal serial bus (USB) communication for more portable foot-based control access for cautery, camera toggle, arm toggle and the like.
6. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the mixed reality headset (128) may be any immersive headset such as virtual reality headset, augmented reality headset and the like.
7. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the mixed reality headset (128) is capable of operating in both wired and wireless modes depending on the fidelity of a signal.
8. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the mixed reality headset (128) may include a camera, a microphone, a motion sensor, and a gaze tracker and the like.
9. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the mixed reality headset (128) can project three-dimensional (3D) vision of an endoscope.
10. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the left-hand gripper (120a) and right-hand gripper (120b) contain electromagnetic wave-based sensors (116) configured to translate position and orientation of sensors (116) to a frame of the tool tip of robotic surgical instruments connected to the robotic arms (102a), (102b), (102c), (102d), (102e).
11. The mixed reality surgeon console system (106) as claimed in claim 1, wherein holographic UI elements can be used to provide the surgeon (110) with cautery power settings and toggle, endoscope toggle and control, robotic arm toggling, 3D DICOM over a virtual patient, real-time miniature virtual diorama of an operating room (OR) setup, live holographic surgeons for tele-mentoring, virtual AI assistant for intra-operative guidance with patient-specific medical data, active eye tracking of surgeons to read fatigue levels and distraction, and tangible digital twins of surgical equipment and surgical robotic system components for pre-operative practice.
12. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the surgeon (110) can interact with the holographic UI elements by using hand gestures, voice commands, eye movements, and head movements.
13. The mixed reality surgeon console system (106) as claimed in claim 1, wherein the 3D DICOM provides the surgeon (110) with a virtual representation of the patient for viewing the patient's anatomy and vital signs in real-time.
14. A mixed reality surgeon console (200) for a multi-arm robotic surgical system (100) comprising one or more robotic arms (102a), (102b), (102c), (102d), (102e), where one of the robotic arms (102a) is coupled to an endoscopic camera and each of the remaining robotic arms (102b), (102c), (102d), (102e) is coupled to a robotic surgical instrument at its distal end, the surgeon console (200) comprising:
a housing (201) mounted at one end of a stand (207);
a plurality of slots (203a) (203b) positioned at the front side of the housing (201);
a plurality of hand controllers (205a) (205b) connected to the slots (203a) (203b) respectively, each of the hand controllers (205a) (205b) comprises a handshake gripper (409), a distal end ball joint (401) operationally secured to the housing (201) at one end, a proximal end ball joint (405) operationally secured to the handshake gripper (409) at one end, a telescopic linkage (403) connecting the other ends of the distal end ball joint (401) and the proximal end ball joint (405);
a foot pedals assembly (209) mounted at the other end of the stand (207);
a control unit (215) configured to enable a two-way communication between the hand controllers (205a, 205b) and the robotic arms (102a), (102b), (102c), (102d), (102e) by using a data transceiver (217); and
a mixed reality headset (211) connected to the housing (201), the mixed reality headset (211) configured to receive holographic UI elements from the control unit (215) and data from a plurality of sensors like a haptic touch of the handshake gripper (409),
wherein the hand controllers (205a) (205b) are configured to provide a magnetic brake to simulate haptic touch at the distal end ball joint (401) and a linear telescopic mechanism (403).
15. The surgeon console (200) as claimed in claim 14, wherein the foot pedals assembly (209) may be a detachable foot pedal board with wireless and wired universal serial bus (USB) communication for more portable foot-based control access for cautery, camera toggle, arm toggle and the like.
16. The surgeon console (200) as claimed in claim 14, wherein the mixed reality headset (211) may be connected to the housing (201) by a wired means (213).
17. The surgeon console (200) as claimed in claim 14, wherein the mixed reality headset (211) may be any immersive headset such as virtual reality headset, augmented reality headset and the like.
18. The surgeon console (200) as claimed in claim 14, wherein the mixed reality headset (211) can project three-dimensional (3D) vision of an endoscope.
19. The surgeon console (200) as claimed in claim 14, wherein the hand controllers (205a) (205b) contain passive sensors configured to translate position and orientation of sensors to a frame of the tool tip of robotic surgical instruments connected to the robotic arms (102a), (102b), (102c), (102d), (102e).
20. The surgeon console (200) as claimed in claim 14, wherein the holographic UI elements can be used to provide the user with cautery power settings and toggle, endoscope toggle and control, robotic arm toggling, 3D DICOM over a virtual patient, real-time miniature virtual diorama of an operating room (OR) setup, live holographic surgeons for tele-mentoring, virtual AI assistant for intra-operative guidance with patient-specific medical data, active eye tracking of surgeons to read fatigue levels and distraction, and tangible digital twins of surgical equipment and surgical robotic system components for pre-operative practice.
21. The surgeon console (200) as claimed in claim 14, wherein the plurality of sensors can be any one of a camera, a microphone, a motion sensor, and a gaze tracker and the like.
22. The surgeon console (200) as claimed in claim 14, wherein a user can interact with the holographic UI elements by using hand gestures, voice commands, eye movements, and head movements.
23. The surgeon console (200) as claimed in claim 14, wherein the linear telescopic mechanism (403) would be used to actuate a translational motion of the handshake gripper (409) of each of the hand controllers (205a) (205b).
24. The surgeon console (200) as claimed in claim 14, wherein the distal end ball joint (401) provides haptic feedback for Roll, Pitch, and Yaw motions of the handshake gripper (409).
25. The surgeon console (200) as claimed in claim 14, wherein the telescopic linkage (403) provides haptic feedback for in and out motions of the handshake gripper (409).
26. The surgeon console (200) as claimed in claim 14, wherein the proximal end ball joint (405) provides haptic feedback for 1:1 hand motion translation of the robotic surgical instruments connected to the robotic arms (102a), (102b), (102c), (102d), (102e).