Abstract: ROBOTIC SURGICAL SYSTEM WITH INSTRUMENT VALIDATION AND USAGE MONITORING SUBSYSTEM. A robotic surgical system including a patient-side cart comprising a plurality of robotic arms, where at least one robotic arm is configured to hold a surgical instrument, a surgeon console, a vision cart, and a sub-system configured to validate and monitor usage of the surgical instrument. The sub-system comprises a local edge computing device comprising a processor operatively connected to a Near Field Communication (NFC) reader configured to read a passive NFC tag embedded on the surgical instrument and an endoscopic camera configured to capture a video stream of a surgical site in real-time or near real-time. The processor is configured to execute a trained AI model to identify a type of the surgical instrument, perform a dual verification by comparing the type of the surgical instrument identified by the trained AI model with a unique instrument identifier retrieved from the NFC tag, and inhibit continuation of a surgical procedure and raise an alert if the unique instrument identifier and the AI-identified instrument type do not match. FIG. 2
Description: TECHNICAL FIELD
[0001] The present disclosure relates generally to the field of robotic surgical systems, and more particularly to a robotic surgical system comprising a sub-system for instrument validation and usage monitoring and a method for validation and usage monitoring of a surgical instrument used in a robotic surgical system.
BACKGROUND
[0002] Surgical robotic systems have significantly enhanced the quality and efficiency of minimally invasive procedures by enabling precise, controlled, and dexterous manipulation of surgical instruments. The surgical robotic systems generally include a robotic arm to hold interchangeable surgical instruments, a surgeon console for remote control, and imaging systems for real-time visualization. The surgical instruments used in the robotic surgical systems, such as graspers, scissors, needle holders, and energy devices, are mechanically complex and designed for a limited number of surgical uses. The accurate identification and tracking of the surgical instruments are essential to ensure patient safety and enhanced system performance.
[0003] Conventional surgical robotic systems utilize contact-based communication methods, such as Inter-Integrated Circuit (I2C) protocols via pogo-pin interfaces, to retrieve and store instrument-related data, including Instrument Identifier (ID), part number, and usage count. These data are typically stored in Electrically Erasable Programmable Read-Only Memory (EEPROM) located within each instrument. However, such contact-based systems present several limitations. The physical connectors are prone to wear and failure, and the stored data can potentially be tampered with, allowing unauthorized extension of instrument use beyond recommended safety threshold values. Moreover, because instrument validation is based solely on internal data, a mismatch between the physical instrument and the stored configuration, such as an incorrect ID, can lead to malfunction of the surgical robotic system and cause a serious safety risk for the patient.
[0004] In addition, current surgical robotic systems lack centralized tracking mechanisms. The instrument usage data is stored locally, making it difficult to manage instrument inventory across multiple hospitals or surgical centres. This limits collaboration and transparency, and increases the likelihood of unauthorized reuse or mismanagement of surgical instruments. Furthermore, conventional robotic surgical systems do not offer any independent verification of the actual instrument type once it is loaded into the robotic surgical system. As a result, the robotic surgical system cannot autonomously confirm whether the surgical instrument connected physically matches the expected instrument ID or configuration.
[0005] Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks of existing instrument validation and usage monitoring systems in the surgical robotic platforms.
SUMMARY
[0006] The present disclosure provides a robotic surgical system comprising a sub-system for instrument validation and usage monitoring and a method for validation and usage monitoring of a surgical instrument used in a robotic surgical system. The present disclosure provides a solution to the technical problem of ensuring accurate, tamper-resistant, and real-time identification of surgical instruments while enforcing usage limits across multiple facilities. An aim of the present disclosure is to provide a solution that overcomes at least partially the problems encountered in the prior art by incorporating a local edge computing device operatively connected to a Near Field Communication (NFC) reader and an endoscopic camera, enabling dual verification of the surgical instrument through Artificial Intelligence (AI)-based visual classification and NFC-based identification.
[0007] One or more objectives of the present disclosure are achieved by the solutions provided in the enclosed embodiments. Advantageous implementations of the present disclosure are further defined in the enclosed embodiments.
[0008] In one aspect, the present disclosure provides a robotic surgical system, comprising:
a patient-side cart comprising a plurality of robotic arms, where at least one of the plurality of robotic arms is configured to hold a surgical instrument;
a surgeon console configured to receive inputs from a surgeon to control the plurality of robotic arms;
a vision cart configured to process and display images from a surgical site; and
a sub-system configured to validate and monitor usage of the surgical instrument, the sub-system comprising:
a local edge computing device comprising a processor operatively connected to:
a Near Field Communication (NFC) reader configured to read a passive NFC tag embedded on the surgical instrument to retrieve a unique instrument identifier (ID) and a usage count of the surgical instrument; and
an endoscopic camera configured to capture a video stream of the surgical site in real-time or near real-time;
wherein the processor is configured to:
execute a trained Artificial Intelligence (AI) model to identify a type of the surgical instrument in real-time or near real-time from the endoscopic video stream;
perform a dual verification by comparing the type of the surgical instrument identified by the trained AI model with the unique instrument identifier retrieved from the passive NFC tag;
transmit metadata associated with the surgical instrument to a centralized cloud database over a secure communication protocol, wherein the centralized cloud database is configured to maintain records of the surgical instrument usage across multiple surgical systems and facilities, and to prevent reuse of the surgical instrument beyond authorized limits; and
inhibit continuation of a surgical procedure and raise an alert if the unique instrument identifier and the AI-identified instrument type do not match, or if the usage count of the surgical instrument exceeds the authorized limit, thereby enhancing patient safety by preventing use of an incorrect or unauthorized surgical instrument.
[0009] The disclosed robotic surgical system provides an intelligent and safety-enhanced design that facilitates accurate surgical instrument validation and usage monitoring during robotic-assisted surgical procedures. The robotic surgical system includes the patient-side cart featuring the plurality of robotic arms, where at least one robotic arm is configured to hold and manipulate the surgical instrument with precision. The surgeon console functions as the primary interface, allowing the surgeon to control the plurality of robotic arms while viewing real-time surgical site imagery processed and displayed through the vision cart. To ensure correct instrument usage and prevent unauthorized reuse, the robotic surgical system incorporates the sub-system comprising the local edge computing device operatively connected to the NFC reader and the endoscopic camera. The NFC reader retrieves a unique instrument identifier and usage count from the passive NFC tag embedded on the surgical instrument, while the endoscopic camera captures a real-time video stream of the surgical site. The edge processor is configured to execute the trained AI model to visually identify the instrument type from the video stream and perform dual verification by comparing this identification with the NFC-retrieved data. If a mismatch is detected, the system inhibits the continuation of the surgical procedure and triggers a safety alert to the user. Additionally, the instrument metadata is transmitted over a secure protocol to a centralized cloud database, which maintains comprehensive usage records across multiple systems and facilities, and enforces limits on reusability of the surgical instrument. By integrating wireless identification, the AI-driven visual validation, and centralized data tracking, the disclosed robotic surgical system significantly enhances procedural safety, operational transparency, and multi-institutional instrument management.
[0010] In an implementation form, the trained AI model is configured to classify the surgical instrument based on tip geometry and articulation features. By using the trained AI model to detect fine-grained visual distinctions based on the tip geometry and movement mechanics, the sub-system significantly enhances its ability to perform real-time, high-confidence identification of the surgical instrument, even in complex surgical environments.
[0011] In a further implementation form, the processor is configured to periodically re-train the trained AI model using updated labelled imaging datasets of surgical instruments comprising graspers, scissors, needle holders, energy devices, staplers, and suction-irrigation tools. The periodic re-training allows the trained AI model to learn from new examples and edge cases, thereby enhancing its ability to make accurate and reliable classifications during live surgical procedures. This ensures that the sub-system continues to deliver safe, error-resistant, and context-aware instrument validation.
[0012] In a further implementation form, the NFC tag is further configured to store additional metadata comprising at least one of: manufacturer identifier, sterilization history, model number of the surgical instrument, or expiration date. By storing the additional metadata, such as manufacturer identifier, sterilization history, model number, and expiration date within the NFC tag embedded in each surgical instrument, the system enhances traceability, ensures compliance with sterilization protocols, supports proactive maintenance, and mitigates the risk of using expired or unauthorized tools, thereby significantly improving patient safety and operational reliability across globally networked surgical facilities.
[0013] In a further implementation form, the secure communication protocol comprises an encrypted connection established using Transport Layer Security (TLS) and a message queuing protocol configured to transmit metadata to the centralized cloud database. Using TLS encryption ensures that the data is protected from interception, manipulation, or cyberattacks during transfer. Meanwhile, employing the message queuing protocol provides reliable and efficient communication, particularly in systems where data must be transmitted continuously, even in environments with intermittent connectivity.
[0014] In a further implementation form, the processor is further configured to locally store a temporary log of instrument verification events within the local edge computing device prior to transmission to the centralized cloud database. By storing a local copy of verification events, the sub-system ensures that no critical data is lost during periods of network unavailability. Additionally, local logging allows for faster access to recent events for on-the-spot diagnostics or audits, without needing to query the centralized cloud database.
[0015] In a further implementation form, the centralized cloud database is further configured to generate a usage report or trigger an alert when the surgical instrument approaches a predefined maximum usage threshold. By proactively notifying the system or the clinical staff before the usage limit is exceeded, the centralized cloud database helps ensure that instruments are retired or replaced promptly, thereby enhancing preventive maintenance, safety compliance, and inventory management.
[0016] In a further implementation form, the centralized cloud database is further configured to store instrument-specific data recorded at the time of manufacturing, where the instrument-specific data includes one or more of: a unique instrument identifier, an instrument type, and an authorized usage count, thereby enabling traceability and post-operative verification of the surgical instrument in case of a system malfunction or suspected tampering with the sub-system. This feature enables reliable traceability and post-operative verification of the surgical instrument by providing access to immutable and manufacturer-recorded data. In the event of a system malfunction or suspected tampering with the instrument validation sub-system, the centralized cloud database serves as a secure and authoritative reference, allowing healthcare providers to confirm the authenticity, type, and usage history of each instrument. This enhances accountability, supports regulatory compliance, and reinforces patient safety by ensuring only validated instruments are used in surgical procedures.
[0017] In a further implementation form, the sub-system is further configured to permit an override of the surgical procedure inhibition upon confirmation from the surgeon through a secured interface. Providing the option to override the surgical procedure inhibition ensures clinical flexibility and decision-making autonomy, while maintaining accountability and traceability through secure access control.
[0018] In a further implementation form, the endoscopic camera is a stereoscopic camera configured to capture three-dimensional images of the surgical instrument and the surgical site. The 3D imagery enhances the performance of AI-based instrument recognition, allowing the sub-system to distinguish between instruments with similar shapes or profiles based on their depth and articulation characteristics.
[0019] In a further implementation form, the vision cart is further configured to display a visual alert on a user interface when a mismatch is detected during the dual verification. By clearly presenting a visual warning, the system ensures that clinicians are promptly informed of the issue, enabling them to take corrective action or initiate an authorized override if required.
[0020] In another aspect, the present disclosure provides a method for operating a robotic surgical system. The method comprises installing a surgical instrument onto at least one robotic arm of a robotic surgical system via a sterile adapter and retrieving a unique identifier and a usage count associated with the surgical instrument by reading a Near Field Communication (NFC) tag embedded in the surgical instrument using an NFC reader mounted on the at least one robotic arm. The method further comprises capturing and displaying images of a surgical site in real-time or near real-time using an endoscopic camera inserted into a patient’s body through a trocar and identifying a type of the surgical instrument from the captured images using a trained Artificial Intelligence (AI) model. The method further comprises performing a dual verification by comparing the identified type of the surgical instrument with the retrieved unique identifier and inhibiting continuation of the surgical procedure and generating an alert if the comparison between the retrieved identifier and the AI-identified instrument type indicates a mismatch or if the usage count exceeds an authorized limit. The method further comprises enabling a surgeon to control the at least one robotic arm to manipulate the validated surgical instrument upon successful completion of the dual verification. The method manifests all the advantages and technical effects of the robotic surgical system comprising the sub-system configured to validate and monitor usage of the surgical instrument of the present disclosure.
[0021] It is to be appreciated that all the aforementioned implementation forms can be combined.
[0022] It has to be noted that all devices, elements, circuitry, units and means described in the present application could be implemented in the software or hardware elements or any kind of combination thereof. All steps which are performed by the various entities described in the present application as well as the functionalities described to be performed by the various entities are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. Even if, in the following description of specific embodiments, a specific functionality or step to be performed by external entities is not reflected in the description of a specific detailed element of that entity which performs that specific step or functionality, it should be clear for a skilled person that these methods and functionalities can be implemented in respective software or hardware elements, or any kind of combination thereof. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the embodiments.
[0023] Additional aspects, advantages, features, and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative implementations construed in conjunction with the embodiments that follow.
BRIEF DESCRIPTION OF THE DRAWINGS
[0024] The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
[0025] Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
FIG. 1 is a diagram illustrating a robotic surgical system, in accordance with an embodiment of the present disclosure;
FIG. 2 is a block diagram of a sub-system for validating and monitoring usage of a surgical instrument in a robotic surgical system, in accordance with an embodiment of the present disclosure;
FIGs. 3A and 3B are collectively a flowchart of a method for operating a robotic surgical system, in accordance with an embodiment of the present disclosure;
FIG. 4 illustrates a process of an Artificial Intelligence (AI)-based instrument validation and usage monitoring system for a robotic surgical system, in accordance with an embodiment of the present disclosure;
FIG. 5 is a block diagram illustrating a surgical instrument validation process within a robotic surgical system, in accordance with an embodiment of the present disclosure; and
FIG. 6 is a block diagram illustrating an endoscopic video processing within a robotic surgical system, in accordance with an embodiment of the present disclosure.
[0026] In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF EMBODIMENTS
[0027] The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
[0028] FIG. 1 is a diagram illustrating a robotic surgical system, in accordance with an embodiment of the present disclosure. With reference to FIG. 1, there is shown a robotic surgical system 100 including a patient-side cart 110, a vision cart 120, and a surgeon console 130.
[0029] The patient-side cart 110 refers to a mobile platform and comprises a plurality of robotic arms. The patient-side cart 110 is configured to support the plurality of robotic arms positioned adjacent to a patient during surgical procedures. The patient-side cart 110 includes a base mounted on wheels. The patient-side cart 110 further includes a vertical column extending upward from the base. The plurality of robotic arms extends from the vertical column of the patient-side cart 110. In some implementations, the plurality of robotic arms includes four robotic arms, in which three robotic arms 112 are configured for surgical instrument manipulation and one robotic arm 113 is configured for endoscopic imaging. The robotic arms include primary segments, secondary segments, and tertiary segments connected by rotational joints. The rotational joints contain servo motors, enabling precise angular positioning. The robotic arms may include surgical instrument holders 114 at distal ends. The surgical instrument holders 114 comprise mechanical interfaces and electrical connectors. In an implementation, the surgical instrument holders 114 include an actuator configured to attach at least one surgical instrument 140 to the surgical instrument holder 114 via a sterile adapter. The mechanical interfaces include spring-loaded clamps for instrument attachment. The electrical connectors transmit power and signals to mounted instruments. The patient-side cart 110 further includes the at least one surgical instrument 140 mounted to the surgical instrument holders 114 at one of the three robotic arms 112. In the present configuration, a passive Near Field Communication (NFC) tag is installed on the at least one surgical instrument 140. Furthermore, the robotic surgical system 100 includes a sub-system 142 (shown and described in detail, for example, in FIG. 2) configured to validate and monitor usage of the at least one surgical instrument 140. The sub-system 142 may be either an independent unit or a part of the vision cart 120. The sub-system 142 includes a local edge computing device operatively connected to an NFC reader and an endoscopic camera 138 (which is held by the robotic arm 113). The NFC reader is configured to read the passive NFC tag embedded on the at least one surgical instrument 140 to retrieve a unique instrument identifier and usage count of the at least one surgical instrument 140, enabling real-time validation at the point of attachment. The at least one surgical instrument 140 includes elongated shafts with end effectors at distal tips. The end effectors include articulation mechanisms enabling pitch and yaw movements. The at least one surgical instrument 140 includes internal drive cables connecting to motor units in the surgical instrument holders 114. The internal drive cables actuate the end effector movements. The robotic arm 113 supports an endoscopic imaging system and includes additional degrees of freedom for camera positioning. The endoscopic imaging system includes dual high-definition camera sensors (such as the endoscopic camera 138) mounted at a distal end of the robotic arm 113. The endoscopic camera 138 enables stereoscopic image capture. The endoscopic imaging system includes fibre optic light transmission bundles surrounding the camera sensors for illuminating the surgical field. The endoscopic imaging system enables both white light imaging and near-infrared fluorescence visualization. The endoscopic imaging system comprises glass rod lenses for controlling chromatic aberration and enhancing image quality.
[0030] The vision cart 120 is a mobile unit comprising a base with wheels and a vertical housing. The base contains power supply units and cooling systems. The vertical housing contains processing units and displays. The vertical housing includes ventilation channels for thermal management. The vision cart 120 includes a primary display 122 mounted at an upper portion of the vertical housing, wherein the primary display 122 comprises a high-definition LCD monitor with anti-glare coating. The vision cart 120 includes an Electro-Surgical Unit (ESU) 124 mounted within the vertical housing. The vision cart 120 further includes endoscope light sources. The endoscope light sources comprise one or two light source units mounted within the vertical housing. The vision cart 120 includes an insufflator unit mounted within the vertical housing for creating and maintaining pneumoperitoneum. The vision cart 120 includes an Uninterruptible Power Supply (UPS) system mounted within the base for providing backup power. The vision cart 120 further includes a video processing unit and a central processing unit within the vertical housing. The video processing unit includes dedicated graphics processors. The central processing unit comprises multiple processing cores. The vision cart 120 further includes data storage devices mounted within the vertical housing. In some implementations, the vision cart 120 comprises image enhancement processors for contrast adjustment and noise reduction. In some implementations, the vision cart 120 includes fluorescence imaging processors for tissue identification. In some implementations, the vision cart 120 includes augmented reality processors for data overlay generation.
[0031] The surgeon console 130 includes a base structure supporting an operator seat and control interfaces. The base structure includes levelling mechanisms for stable positioning. The operator seat comprises height adjustment mechanisms and lumbar support systems. A display housing extends upward and forward from the base structure. The display housing contains a stereoscopic display system 134 for displaying real-time surgical site images in high resolution, providing enhanced depth perception and clarity. The stereoscopic display system 134 includes dual display panels and optical elements. The optical elements include focusing mechanisms and eye tracking sensors. In an implementation, the stereoscopic display system 134 is secured by a monitor mounting assembly 135, which enables adjustable positioning for optimal viewing angles. The monitor mounting assembly 135 may be referred to as an adjustable support system that securely holds and enables the controlled positioning of the stereoscopic display system 134 within a workstation or operational environment. The surgeon console 130 further includes master control manipulators 132 mounted on sides of the base structure in front of the operator seat. The master control manipulators 132 include primary arms, secondary arms, and tertiary arms connected by joints. The joints include force feedback actuators and position sensors. The master control manipulators 132 terminate in ergonomic hand grips. The hand grips contain pressure sensors and multi-function triggers. In some implementations, the surgeon console 130 further includes foot pedals 136 mounted on a lower portion of the base structure. The foot pedals 136 include position sensors and tactile feedback mechanisms. A user interface comprising touchscreens is mounted on the base structure between the master control manipulators 132. The touchscreens display system status information and configuration controls. In some implementations, the surgeon console 130 may also receive output from the sub-system 142, which is configured to validate and monitor the surgical instrument usage.
[0032] The sub-system 142 comprises the local edge computing device with the processor operatively connected to an NFC reader and the endoscopic camera 138. The processor executes a trained AI model to identify surgical instrument types from real-time video streams while simultaneously reading unique instrument identifiers from embedded NFC tags. The sub-system 142 performs dual verification by comparing the AI-identified instrument type with the NFC-retrieved identifier, transmits metadata to a centralized cloud database, and inhibits surgical procedures when mismatches are detected. The results of the validation process, along with usage count and instrument status, may be displayed directly on the surgeon console 130, enabling the surgeon to receive real-time feedback on instrument authenticity, safety, and readiness during the surgical procedure.
[0033] The patient-side cart 110, the vision cart 120, and the surgeon console 130 connect through a communication network. In an implementation, the communication network may use a wired or wireless communication protocol. In an implementation, the communication between the patient-side cart 110, the vision cart 120, and the surgeon console 130 is established through either Ethernet for Control Automation Technology (EtherCAT) or standard Ethernet. The communication network comprises fibre optic cables for high-speed data transmission. The communication network includes redundant data pathways. The communication network transmits control signals from the master control manipulators 132 to the three robotic arms 112. The control signals include position commands and gripper actuation commands. In some implementations, the communication network transmits imaging data from the endoscopic imaging system to the stereoscopic display system 134. The imaging data includes calibration parameters and camera position data. The robotic surgical system 100 includes monitoring systems connected to the communication network. The monitoring systems comprise voltage sensors, current sensors, temperature sensors, and position sensors.
[0034] The endoscopic camera 138 may be referred to as an imaging component configured to capture high-resolution video of the surgical site in real-time or near real-time during robotic-assisted procedures. The endoscopic camera 138 is mounted at the distal end of the robotic arm 113 and includes dual image sensors to provide stereoscopic visualization, enabling depth perception and enhanced spatial awareness for the operating surgeon. The endoscopic camera 138 is further equipped with integrated illumination components, such as fibre optic light bundles, to ensure a well-lit surgical field under various imaging conditions. In some implementations, the endoscopic camera 138 supports both white light imaging and near-infrared fluorescence imaging for improved tissue differentiation. The captured video stream from the endoscopic camera 138 may be transmitted to the local edge computing device for processing and can be displayed on the surgeon console 130.
[0035] The sub-system 142 may be referred to as an intelligent validation and monitoring module integrated within the robotic surgical system 100, configured to ensure the correct identification and safe usage of surgical instruments. The sub-system 142 includes the local edge computing device that includes the processor operatively connected to the NFC reader and the endoscopic camera 138. The sub-system 142 performs dual verification by comparing the AI-identified instrument type with the identifier retrieved from the NFC tag to ensure consistency and prevent mismatched or unauthorized instrument use. In some implementations, the sub-system 142 also transmits validated instrument metadata to a centralized cloud database over a secure communication protocol, enabling global usage tracking and enforcement of usage limits.
[0036] In some implementations, the robotic surgical system 100 executes autonomous and semi-autonomous functions. In some implementations, the robotic surgical system 100 enables system upgrades through modular component replacement. The modular component replacement includes instrument interface upgrades and processing unit upgrades.
[0037] The robotic surgical system 100 enables minimally invasive surgical procedures. Exemplary surgical procedures may include, but are not limited to, general surgery procedures, gynaecological procedures, urological procedures, cardiothoracic procedures, and otolaryngological procedures.
[0038] FIG. 2 is a block diagram of a sub-system for validating and monitoring usage of a surgical instrument in a robotic surgical system, in accordance with an embodiment of the present disclosure. FIG. 2 is described in conjunction with elements of FIG. 1. With reference to FIG. 2, there is shown a block diagram 200 of the sub-system 142 in the robotic surgical system 100. The sub-system 142 includes a local edge computing device 202. The local edge computing device 202 comprises a processor 204 and a memory 206. The processor 204 is configured to execute a trained Artificial Intelligence (AI) model 208 stored in the memory 206. Additionally, the processor 204 is operatively connected to a Near Field Communication (NFC) reader 210 and the endoscopic camera 138. Further, there is shown that the local edge computing device 202 is connected to a centralized cloud database 212 through a communication network 214.
[0039] The local edge computing device 202 may be referred to as a dedicated processing unit configured to perform real-time data acquisition, analysis, and decision-making within the robotic surgical system 100. The local edge computing device 202 includes the processor 204 operatively connected to both the NFC reader 210 and the endoscopic camera 138, enabling the processor 204 to simultaneously retrieve digital instrument metadata and analyze visual input from the surgical site. The local edge computing device 202 may incorporate secure memory modules for temporary data storage and hardware-accelerated AI processing capabilities for executing trained machine learning models. Additionally, the local edge computing device 202 may be designed to operate either independently as a separate unit or in conjunction with the surgeon console 130 and the vision cart 120, facilitating seamless integration without disrupting existing workflows. Examples of the local edge computing device 202 may include but are not limited to a computer, a smart phone, a tablet, a portable electronic device, a computing device, and the like.
[0040] The processor 204 may be referred to as a core computational component designed to execute data processing, instrument validation, and decision-making tasks within the local edge computing device 202. The processor 204 includes one or more processing cores configured to handle simultaneous input from the NFC reader 210 and the endoscopic camera 138, enabling real-time retrieval and analysis of the surgical instrument-related data. Additionally, the processor 204 is designed to run the trained AI model 208, capable of identifying the type of the surgical instrument from the captured video stream. The processor 204 may further include integrated memory controllers and secure communication interfaces to support fast data exchange and encrypted transmission of metadata to the centralized cloud database 212. The processor 204 is optimized for low-latency execution and is designed to integrate seamlessly with other hardware components within the sub-system 142, ensuring continuous and reliable validation of surgical instruments throughout the procedure. In an embodiment, the NFC data associated with the at least one surgical instrument 140 from a robotic arm cart may be transmitted directly to the surgeon console 130, while the endoscopic image data may be routed through the vision cart 120 before reaching the surgeon console 130. In other implementations, both the NFC metadata and the endoscopic camera 138 data may be first collected at the vision cart 120 and then forwarded to the surgeon console 130 for processing and AI-based verification. This configuration enables the processor 204 to access both the digital instrument metadata and real-time visual input from the surgical site, facilitating synchronized validation and monitoring of the at least one surgical instrument 140.
[0041] The memory 206 may be referred to as a data storage component configured to temporarily or permanently store information required for the operation of the local edge computing device 202 within the robotic surgical system 100. The memory 206 includes non-volatile memory for retaining essential firmware, configuration data, and AI model parameters, as well as volatile memory for storing real-time input data, such as video frames from the endoscopic camera 138 and the surgical instrument metadata retrieved from the NFC reader 210. Additionally, the memory 206 may include secure storage zones to protect sensitive information, such as instrument usage records and unique identifier of the surgical instrument, prior to transmission to the centralized cloud database 212. The memory 206 is designed to interface seamlessly with the processor 204, enabling efficient data retrieval and storage operations that contribute to the system’s responsiveness, reliability, and data integrity during robotic-assisted surgical procedures.
[0042] The trained AI model 208 may be referred to as a machine learning-based model configured to perform real-time visual recognition of surgical instruments within the robotic surgical system 100. The trained AI model 208 is embedded within the local edge computing device 202 and is designed to analyze video streams captured by the endoscopic camera 138 to identify the type of surgical instrument in use based on its visual characteristics, such as shape, size, jaw configuration, and movement pattern. Examples of the trained AI model 208 may include but are not limited to, a Convolutional Neural Network (CNN) model, a Recurrent Neural Network (RNN) model, a transformer-based model, any other deep learning architecture-based model including an object detection and segmentation model, a hybrid and specialized architecture-based model and the like, optimized for image classification. The output generated by the trained AI model 208 is used for dual verification, where it is compared against the instrument identifier retrieved from the NFC tag to confirm consistency.
[0043] The NFC reader 210 may be referred to as a wireless communication component configured to retrieve instrument-specific data from an embedded NFC tag located on the surgical instrument (e.g., the at least one surgical instrument 140) within the robotic surgical system 100. The NFC reader 210 is operatively connected to the local edge computing device 202 and is designed to establish short-range, contactless communication with the NFC tag when the instrument is mounted to one of the three robotic arms 112 (of FIG. 1). The NFC reader 210 may support secure data transmission protocols to prevent unauthorized access or tampering and is designed to operate reliably within the sterile and electromagnetically active environment of the operating room. In some implementations, the NFC reader 210 may be integrated directly into the surgical instrument holder to allow seamless interaction with the surgical instrument during mounting.
[0044] The centralized cloud database 212 may be referred to as a secure, network-accessible data storage platform configured to store, manage, and synchronize surgical instrument metadata across multiple robotic surgical systems and healthcare facilities. The centralized cloud database 212 receives validated instrument information, including unique identifiers, usage counts, part numbers, and verification results from the local edge computing device 202 via encrypted communication protocols. In some implementations, the centralized cloud database 212 may include role-based access controls, audit logs, and cybersecurity features, such as data encryption and anomaly detection to ensure data integrity, confidentiality, and compliance with healthcare data standards.
[0045] The communication network 214 may be referred to as a secure and scalable data transmission infrastructure configured to enable real-time and bi-directional communication between the local edge computing device 202 and the centralized cloud database 212. The communication network 214 comprises wired or wireless connectivity interfaces, including Ethernet, Wi-Fi, 4G/5G, or other IP-based protocols, and is integrated with encrypted communication layers, such as Transport Layer Security (TLS) to ensure data privacy, integrity, and authentication during transmission. The communication network 214 facilitates the transfer of surgical instrument metadata, including unique identifiers, usage counts, and validation results, from the local edge computing device 202 to the centralized cloud database 212 for remote logging, compliance auditing, and cross-facility synchronization.
[0046] In operation, there is provided the robotic surgical system 100 comprising the patient-side cart 110 comprising a plurality of robotic arms, where at least one of the plurality of robotic arms is configured to hold a surgical instrument. The robotic surgical system 100 further comprises the surgeon console 130 configured to receive inputs from a surgeon to control the plurality of robotic arms and the vision cart 120 configured to process and display images from a surgical site. The patient-side cart 110, the surgeon console 130, and the vision cart 120 have been shown and described in detail, for example, in FIG. 1. The robotic surgical system 100 further comprises the sub-system 142 configured to validate and monitor the usage of the surgical instrument (i.e., the at least one surgical instrument 140). The sub-system 142 comprises the local edge computing device 202. The local edge computing device 202 acts as a central processing unit within the sub-system 142. The local edge computing device 202 executes the logic and operations required to validate the surgical instrument and monitor its usage count. By processing data locally and in real time, the local edge computing device 202 allows the sub-system 142 to make immediate decisions about whether the surgical instrument is authorized for use or has exceeded its operational lifespan.
[0047] The local edge computing device 202 comprising the processor 204 is operatively connected to the NFC reader 210 configured to read a passive NFC tag embedded on the surgical instrument to retrieve a unique instrument identifier (ID) and a usage count of the surgical instrument. When the surgical instrument is connected, the NFC reader 210 activates and reads the stored metadata including the instrument’s unique ID and the number of times the surgical instrument has been used from the embedded NFC tag. The processor 204 then processes this data in real time, allowing the sub-system 142 to validate the instrument’s identity and check whether it is still within its approved usage limit. The inclusion of the processor 204 and the NFC reader 210 within the local edge computing device 202 serves a significant role in ensuring instrument traceability and safe usage compliance. The surgical instruments have limited lifespans due to wear and mechanical fatigue. The accurate tracking of usage is required to prevent instruments from being used beyond their safe operational limit. Additionally, retrieving a unique instrument ID ensures that the surgical instrument is correctly recognized by the sub-system 142, reducing the risk of mismatched configurations or unauthorized substitutions.
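By way of non-limiting illustration, the retrieval step may be sketched in Python as follows; the fixed binary tag layout, field widths, and field names are assumptions introduced for the example and are not prescribed by the present disclosure:

```python
import struct
from dataclasses import dataclass

@dataclass
class InstrumentTag:
    instrument_id: str    # unique instrument identifier (ID)
    instrument_type: str  # e.g., "grasper", "scissors"
    usage_count: int      # completed uses recorded on the tag

# Hypothetical tag layout (an assumption for this sketch):
#   bytes 0-15  : instrument ID, UTF-8, zero-padded
#   bytes 16-31 : instrument type, UTF-8, zero-padded
#   bytes 32-33 : usage count, big-endian unsigned short
def parse_tag(payload: bytes) -> InstrumentTag:
    instrument_id, instrument_type, usage = struct.unpack(">16s16sH", payload[:34])
    return InstrumentTag(
        instrument_id=instrument_id.rstrip(b"\x00").decode("utf-8"),
        instrument_type=instrument_type.rstrip(b"\x00").decode("utf-8"),
        usage_count=usage,
    )
```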
[0048] Furthermore, the processor 204 is operatively connected to the endoscopic camera 138 which is configured to capture a video stream of the surgical site in real-time or near real-time. The processor 204 receives the video stream directly from the endoscopic camera 138 through a high-speed communication interface. Once the data is received, the processor 204 can perform various processing tasks, including running the trained AI model 208 to identify the instrument type based on its visual features such as shape, orientation, or jaw configuration. The video feed is continuously analyzed in real-time or near real-time, allowing the processor 204 to extract actionable insights without delay. The integration between the processor 204 and the endoscopic camera 138 enables the sub-system 142 to validate surgical instrument usage, enhance situational awareness, and provide accurate visual data to the surgeon.
[0049] The processor 204 is configured to execute the trained AI model 208 to identify a type of the surgical instrument in real-time or near real-time from the endoscopic video stream. The trained AI model 208, which has been trained on various images of surgical instruments, uses techniques, such as deep learning and computer vision, to recognize instrument-specific features like shape, size, jaw type, and articulation mechanisms. As the endoscopic video stream is analyzed in real-time or near real-time, the trained AI model 208 outputs the identified instrument type. The processor 204 then uses this output to compare it with other identification sources (e.g., the NFC reader 210 data) to confirm the instrument’s authenticity. By enabling the processor 204 to run the trained AI model 208 that independently analyzes visual characteristics, the sub-system 142 performs a second layer of verification that improves safety, reduces human error, and ensures that the physical instrument matches its digital configuration.
[0050] The processor 204 is further configured to perform a dual verification by comparing the type of the surgical instrument identified by the trained AI model 208 with the unique instrument ID retrieved from the passive NFC tag. During the surgical instrument setup, the NFC reader 210 retrieves the unique identifier and usage count stored on the instrument’s NFC tag, and this data is passed to the processor 204. At the same time, the endoscopic camera 138 captures a live video stream of the surgical site, which is analyzed by the processor 204 running the trained AI model 208 to visually identify the instrument type. The processor 204 then compares the AI-detected instrument type against the unique instrument ID retrieved from the NFC reader 210. If the results match, the sub-system 142 proceeds with the surgical procedure; if there is a mismatch, the processor 204 can trigger a warning or inhibit the surgical procedure until the discrepancy is resolved. The dual verification ensures that only validated instruments within their approved usage limits are used for robotic surgical procedures.
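A minimal sketch of this comparison logic is given below; the identifier-to-type lookup table and the authorized limit of ten uses are illustrative assumptions:

```python
def dual_verify(ai_type: str, nfc_id: str, usage_count: int,
                id_to_type: dict, authorized_uses: int = 10) -> str:
    """Return a decision string; 'PROCEED' only if both checks pass."""
    expected = id_to_type.get(nfc_id)
    if expected is None or expected != ai_type:
        return "INHIBIT_MISMATCH"      # visual type disagrees with NFC identity
    if usage_count >= authorized_uses:
        return "INHIBIT_USAGE_LIMIT"   # reuse beyond the authorized limit
    return "PROCEED"

# Example: a grasper tag whose AI classification also reads "grasper"
print(dual_verify("grasper", "INS-0042", 3, {"INS-0042": "grasper"}))  # PROCEED
```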
[0051] The processor 204 is further configured to transmit metadata associated with the surgical instrument to the centralized cloud database 212 over a secure communication protocol, where the centralized cloud database 212 is configured to maintain records of the surgical instrument usage across multiple surgical systems and facilities, and to prevent reuse of the surgical instrument beyond authorized limits. After validating the surgical instrument using data from the passive NFC tag and visual identification via the trained AI model 208, the processor 204 compiles relevant metadata associated with the surgical instrument. The metadata is formatted and transmitted to the centralized cloud database 212 through a secure communication protocol, such as Hyper Text Transfer Protocol Secure (HTTPS) or another encrypted channel that ensures data confidentiality and integrity. The centralized cloud database 212 logs the data in real time, updating the instrument’s usage history and verifying whether the current use remains within approved limits. If the surgical instrument is found to exceed its allowed usage limits, the centralized cloud database 212 can notify the processor 204 to restrict further use. By transmitting metadata to the centralized cloud database 212, the sub-system 142 ensures that usage data is globally accessible and protected against unauthorized modification.
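As a non-limiting example, the HTTPS transmission may be sketched with the Python standard library as follows; the endpoint URL and the payload schema are assumptions introduced for the example:

```python
import json
import ssl
import urllib.request

# Illustrative metadata upload over HTTPS/TLS. The URL and field names
# are hypothetical; a deployment would use its own cloud endpoint and schema.
def transmit_metadata(instrument_id: str, usage_count: int, result: str,
                      url: str = "https://cloud.example.com/api/instrument-events") -> int:
    payload = json.dumps({
        "instrument_id": instrument_id,
        "usage_count": usage_count,
        "verification_result": result,
    }).encode("utf-8")
    req = urllib.request.Request(url, data=payload,
                                 headers={"Content-Type": "application/json"})
    ctx = ssl.create_default_context()  # verifies the server certificate chain
    with urllib.request.urlopen(req, context=ctx, timeout=10) as resp:
        return resp.status              # 2xx indicates the record was logged
```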
[0052] Furthermore, the processor 204 is configured to inhibit continuation of a surgical procedure and raise an alert if the unique instrument identifier and the AI-identified instrument type do not match, or if the usage count of the surgical instrument exceeds the authorized limit, thereby enhancing patient safety by preventing use of an incorrect or unauthorized surgical instrument. During the surgical instrument setup, the processor 204 simultaneously receives input from two sources: the NFC reader 210, which provides the instrument's unique identifier and metadata, and the trained AI model 208, which processes the endoscopic video stream to determine the actual instrument type in use. The processor 204 compares these two results in real time. If the AI-identified instrument type does not match the type associated with the NFC-retrieved identifier, the processor 204 initiates a control signal to pause or inhibit the surgical procedure and triggers a visual or audible alert, for example on the surgeon console 130. Likewise, if the AI-identified instrument type matches the instrument type associated with the NFC-retrieved unique identifier but the usage count of the surgical instrument has exceeded the authorized limit, the surgical procedure is also inhibited. The immediate feedback mechanism ensures that no surgical action proceeds until the mismatch is resolved, thereby maintaining procedural integrity and safeguarding the patient.
[0053] The robotic surgical system 100 is configured to perform dual validation of the at least one surgical instrument 140 before allowing its use in a surgical procedure. Once the surgical instrument is installed onto a robotic arm (i.e., one of the three robotic arms 112) via a sterile adapter, the first validation step begins with NFC-based identification. The NFC reader 210, positioned at the base of the robotic arm, reads data from the NFC tag embedded in the surgical instrument (i.e., the at least one surgical instrument 140), retrieving a unique identifier and usage count. The NFC data is transmitted through the robotic arm cart to the surgeon console 130, where the local edge computing device 202 comprising the processor 204 is configured to analyze the NFC data. In an implementation scenario, a control software installed on the surgeon console 130 may be used to process the NFC data. The surgical instrument is then inserted into the patient’s body through a trocar, while the endoscopic camera 138, inserted into the patient’s body through a separate trocar, captures real-time or near real-time images of the surgical instrument inside the patient’s body. When the instrument’s tip enters the camera’s field of view, the visual feed is sent to the vision cart 120 and then to the surgeon console 130, where artificial intelligence performs the second validation by identifying the instrument type based on its visual characteristics. Only after the AI-identified instrument type matches the NFC identifier and the usage count is within the authorized limit does the robotic surgical system 100 allow the surgeon to proceed with manipulating the surgical instrument.
[0054] In accordance with an embodiment, the trained AI model 208 is configured to classify the surgical instrument based on tip geometry and articulation features. The trained AI model 208 analyzes frames from the endoscopic video stream and extracts spatial features, such as the contours, angles, and articulation patterns of the instrument tip. These features are then passed through multiple layers of the trained AI model 208 (e.g., convolutional, pooling, and fully connected layers in the case of a CNN model), which have been trained on a dataset of labelled instrument images. Based on learned patterns, the trained AI model 208 outputs a predicted instrument type with a confidence score. The classification result is then used by the processor 204 to perform dual verification against data retrieved from the passive NFC tag. The use of the trained AI model 208 ensures generalization across variations in angle, lighting, and partial occlusion, enabling robust instrument recognition in real-world surgical scenarios.
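For illustration only, hand-crafted tip-geometry descriptors of the kind such a model might exploit can be sketched with OpenCV as follows; the Otsu thresholding step and the chosen descriptors are assumptions, since a trained CNN would learn comparable features internally rather than compute them explicitly:

```python
import cv2
import numpy as np

# Illustrative tip-geometry feature extraction from a grayscale frame.
def tip_features(gray_frame: np.ndarray) -> dict:
    # Binarize the frame; Otsu picks a global threshold automatically.
    _, mask = cv2.threshold(gray_frame, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {}
    tip = max(contours, key=cv2.contourArea)  # largest blob taken as the tip
    hull = cv2.convexHull(tip)
    area = cv2.contourArea(tip)
    perim = cv2.arcLength(tip, True)
    return {
        # Solidity drops as the jaws open (concavity between the jaws).
        "solidity": area / max(cv2.contourArea(hull), 1e-6),
        # Circularity distinguishes blunt from elongated tip profiles.
        "circularity": 4 * np.pi * area / max(perim ** 2, 1e-6),
    }
```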
[0055] In accordance with an embodiment, the processor 204 is configured to periodically re-train the trained AI model 208 using updated labelled imaging datasets of surgical instruments comprising graspers, scissors, needle holders, energy devices, staplers, and suction-irrigation tools. The processor 204 manages the re-training process by accessing updated labelled datasets, collections of surgical instrument images that have been annotated with their correct classifications. The labelled imaging datasets may be generated locally, received from connected surgical systems, or sourced from the centralized cloud database 212. During re-training, the processor 204 uses the labelled examples to fine-tune or augment the parameters of the trained AI model 208. This may involve updating weights, adjusting network architecture, or expanding the classification categories. The re-training process can be performed during system downtime or offloaded to a connected training server, with the updated model subsequently deployed back to the processor 204. The periodic re-training allows the trained AI model 208 to learn from new examples and edge cases, thereby enhancing its ability to make accurate and reliable classifications during live surgical procedures.
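A minimal fine-tuning sketch in PyTorch is shown below; the optimizer, learning rate, and data loader are illustrative assumptions, and in practice the labelled images would come from the sources described above:

```python
import torch
from torch import nn

# Illustrative re-training (fine-tuning) loop over labelled instrument images.
def retrain(model: nn.Module, loader, epochs: int = 1, lr: float = 1e-4) -> nn.Module:
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:   # batches of annotated instrument images
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()             # update weights from the new examples
            optimizer.step()
    return model                        # deployed back to the processor afterwards
```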
[0056] In accordance with an embodiment, the NFC tag is further configured to store additional metadata comprising at least one of: manufacturer identifier, sterilization history, model number of the surgical instrument, or expiration date. The NFC tag, which is integrated into the surgical instrument, is programmed to store the additional metadata during manufacturing or sterilization processing. When the surgical instrument is loaded into the robotic surgical system 100, the NFC reader 210 retrieves the stored information wirelessly and passes it to the processor 204 for validation. The processor 204 may display this data on the surgeon console 130 for review, cross-check it with centralized records, or log it to the centralized cloud database 212 for audit and compliance purposes. The NFC tag’s storage is typically configured to be read-only after a certain point to prevent unauthorized modification, ensuring data integrity. By storing the additional metadata of each surgical instrument, the robotic surgical system 100 provides end-to-end visibility and validation, helping meet clinical, operational, and regulatory standards.
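By way of illustration, the additional metadata may be modelled as a read-only record as follows; the field names and types are assumptions mirroring the fields listed above:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Illustrative record of the additional tag metadata. frozen=True makes the
# record immutable after creation, mirroring the tag's read-only state
# after provisioning.
@dataclass(frozen=True)
class InstrumentMetadata:
    manufacturer_id: str
    model_number: str
    expiration_date: Optional[date] = None
    sterilization_history: tuple = ()  # e.g., timestamps of sterilization cycles
```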
[0057] In accordance with an embodiment, the secure communication protocol comprises an encrypted connection established using Transport Layer Security (TLS) and a message queuing protocol configured to transmit metadata to the centralized cloud database 212. The TLS is a widely adopted cryptographic protocol that ensures data transmitted over the network is encrypted and protected from unauthorized access or tampering. The message queuing protocol, such as Message Queuing Telemetry Transport (MQTT) or Advanced Message Queuing Protocol (AMQP) is designed to reliably transmit data packets (messages) between system components in a structured, asynchronous, and scalable manner. When the processor 204 prepares to transmit validated metadata, such as instrument ID, usage count, and validation status, the processor 204 first establishes a TLS-encrypted connection to the centralized cloud database 212. The encrypted channel ensures that all transmitted data remains confidential and tamper-proof during network communication. The metadata is then formatted into discrete messages and transmitted using the message queuing protocol, which organizes the data into a queue and ensures delivery even if network conditions are unstable. In some implementations, the message queuing protocol may include delivery confirmation, retry logic, and time-stamping to ensure the data is accurate, complete, and auditable.
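A non-limiting sketch of a TLS-secured publish using the paho-mqtt client (the 1.x-style constructor is assumed here) is given below; the broker host, topic name, and payload schema are likewise assumptions:

```python
import json
import paho.mqtt.client as mqtt

# Illustrative TLS-secured MQTT publish of validated instrument metadata.
client = mqtt.Client()
client.tls_set()                             # TLS using the system CA certificates
client.connect("broker.example.com", 8883)   # 8883: conventional MQTT-over-TLS port
client.loop_start()                          # background thread handles retries/acks

payload = json.dumps({"instrument_id": "INS-0042", "usage_count": 4,
                      "verification_result": "PROCEED"})
info = client.publish("or/instrument-events", payload, qos=1)  # qos=1: at-least-once
info.wait_for_publish()                      # block until the broker acknowledges
client.loop_stop()
```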
[0058] In accordance with an embodiment, the processor 204 is further configured to locally store a temporary log of instrument verification events within the local edge computing device 202 prior to transmission to the centralized cloud database 212. As the processor 204 executes the dual verification process, comparing the AI-identified instrument type with the passive NFC tag data including the unique instrument identifier of the surgical instrument, the processor 204 simultaneously writes a record of each verification event into the temporary log stored within the local memory (e.g., the memory 206) of the local edge computing device 202. The temporary log may include structured entries, such as the instrument ID, verification result, timestamp, and system response (e.g., approval in the case of a successful dual verification match, or an alert in the case of an instrument type mismatch or an exceeded usage limit). The processor 204 manages the temporary log using a local database or file-based storage system, and may be configured to retain entries for a predefined duration or until successful synchronization with the centralized cloud database 212 is confirmed. Once a secure connection is established, the processor 204 transmits the logged data to the centralized cloud database 212 using the secure communication protocol. After a successful upload and acknowledgment, the local log can be cleared or archived.
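For illustration, such a temporary log may be sketched with SQLite as follows; the table schema and the mark-after-sync policy are assumptions for the example (rows could instead be deleted or archived after acknowledgment):

```python
import sqlite3
import time

# Illustrative temporary verification log backed by SQLite.
db = sqlite3.connect("verification_log.db")
db.execute("""CREATE TABLE IF NOT EXISTS events (
    instrument_id TEXT, result TEXT, ts REAL, synced INTEGER DEFAULT 0)""")

def log_event(instrument_id: str, result: str) -> None:
    db.execute("INSERT INTO events (instrument_id, result, ts) VALUES (?, ?, ?)",
               (instrument_id, result, time.time()))
    db.commit()

def sync_pending(upload) -> None:
    # upload: a callable that sends one event to the cloud; it should raise
    # on failure, leaving the row unsynced so it is retried later.
    rows = db.execute("SELECT rowid, instrument_id, result, ts FROM events "
                      "WHERE synced = 0").fetchall()
    for rowid, *event in rows:
        upload(event)
        db.execute("UPDATE events SET synced = 1 WHERE rowid = ?", (rowid,))
    db.commit()
```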
[0059] In accordance with an embodiment, the centralized cloud database 212 is further configured to generate a usage report or trigger an alert when the surgical instrument approaches a predefined maximum usage threshold. When the surgical procedure is performed, the metadata including the instrument’s identifier and the usage count is transmitted by the processor 204 to the centralized cloud database 212. The centralized cloud database 212 continuously tracks and aggregates this data for every individual surgical instrument. When the surgical instrument’s usage approaches a predefined threshold (e.g., 80–90% of its allowable use), the database automatically performs a rule-based check. If the threshold is met or exceeded, the sub-system 142 either generates a usage report, which may be accessed through a dashboard or sent via email, or triggers an alert, such as a visual warning in the user interface or a notification to the hospital’s instrument management system. In some implementations, the alert may also be communicated back to the surgeon console 130 or the vision cart 120, ensuring real-time awareness within the operating room.
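The rule-based check could be as simple as the following sketch, assuming an 80% warning fraction; the threshold values and messages are illustrative only.

```python
def check_usage(usage_count: int, max_uses: int, warn_fraction: float = 0.8) -> str:
    """Rule-based check run when new usage metadata arrives in the cloud."""
    if usage_count >= max_uses:
        return "ALERT: authorized usage limit reached; inhibit further use"
    if usage_count >= warn_fraction * max_uses:
        return "WARN: instrument approaching usage limit; generate usage report"
    return "OK"

print(check_usage(usage_count=9, max_uses=10))  # WARN at 90% of allowable use
```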
[0060] In accordance with an embodiment, the centralized cloud database 212 is configured to store instrument-specific data recorded at the time of manufacturing, where the instrument-specific data includes one or more of: a unique instrument identifier, an instrument type, and an authorized usage count, thereby enabling traceability and post-operative verification of the surgical instrument in case of system malfunction or suspected tampering with the sub-system. The data stored at the centralized cloud database 212 includes a unique instrument identifier, such as a serial number or globally unique ID, the instrument type (for example, a needle driver, grasper, or energy device), and an authorized usage count, which specifies the number of surgical procedures in which the surgical instrument can safely be used before disposal. At the manufacturing stage, metadata for each instrument is digitally registered and uploaded to the centralized cloud database 212 via secure, encrypted channels (e.g., HTTPS/TLS). The centralized cloud database 212 may be implemented using a secure and scalable service designed to ensure high availability and integrity of the data. When a surgical procedure is initiated, the processor 204 in the local edge computing device 202 retrieves real-time instrument data from the NFC tag and the trained AI model 208. The processor 204 in the local edge computing device 202 compares this real-time data with the corresponding manufacturing record from the centralized cloud database 212. If a mismatch is detected or if no such record exists for the attached instrument, the system may raise an alert, block the procedure, or log the incident for review. Additionally, the cloud-stored manufacturing metadata can be retrieved post-operatively to validate whether the correct surgical instrument was used, especially when analyzing any unexpected patient outcomes or potential system anomalies.
[0061] In accordance with an embodiment, the sub-system 142 is further configured to permit an override of the surgical procedure inhibition upon confirmation from the surgeon through a secured interface. When a mismatch is detected, such as a discrepancy between the instrument ID read from the passive NFC tag and the AI-identified instrument type, the sub-system 142 initiates a procedure lockout to prevent further operation. Simultaneously, a notification is displayed on the secured interface, typically located on the surgeon console 130 or on the vision cart 120. The interface prompts the surgeon to review the mismatch details and confirm whether to proceed despite the warning. The secured interface may require user authentication, such as a password, biometric input, or hardware token, to validate the surgeon’s identity and authorization level. Upon confirmation, the sub-system 142 logs the override event and temporarily lifts the inhibition, allowing the procedure to continue. All override actions are securely recorded in the system’s local log and transmitted to the centralized cloud database 212 for audit and compliance tracking.
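A minimal sketch of the override gate described above, assuming password-based authentication on the secured interface; the credential handling and the audit structure are simplified placeholders.

```python
import hashlib
import hmac

# Placeholder credential; a real system would use per-surgeon accounts, biometrics, or tokens
AUTHORIZED_HASH = hashlib.sha256(b"surgeon-passphrase").hexdigest()

def request_override(passphrase: str, mismatch_details: dict, audit_log: list) -> bool:
    """Lift the procedure inhibition only after the surgeon authenticates."""
    supplied = hashlib.sha256(passphrase.encode()).hexdigest()
    if not hmac.compare_digest(supplied, AUTHORIZED_HASH):  # constant-time comparison
        return False
    audit_log.append({"event": "override", "details": mismatch_details})  # compliance record
    return True  # inhibition temporarily lifted; event later synced to the cloud
```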
[0062] In accordance with an embodiment, the endoscopic camera 138 is a stereoscopic camera configured to capture three-dimensional images of the surgical instrument and the surgical site. Capturing 3D images during robotic-assisted surgery is essential for providing the surgeon with spatial awareness, depth perception, and precise visualization of the operative field. The stereoscopic camera is mounted on a dedicated robotic arm (e.g., the robotic arm 113) and positioned to provide a clear view of the surgical site and instruments in use. The stereoscopic camera captures two simultaneous image streams from slightly different angles, which are transmitted to the local edge computing device 202 for processing. The image streams can be fused to render a 3D representation of the scene, which is displayed on the stereoscopic monitor at the surgeon console 130 or the vision cart 120 for enhanced visualization. Additionally, the 3D image data is used by the processor 204 to execute the trained AI model 208, which analyses tip geometry, depth cues, and articulation patterns to identify the instrument type with high accuracy. The integration of stereoscopic imaging improves both human and machine interpretation of the surgical environment, resulting in safer and more efficient procedures.
[0063] In accordance with an embodiment, the vision cart 120 is further configured to display a visual alert on a user interface when a mismatch is detected during the dual verification. Displaying the visual alert ensures real-time situational awareness and patient safety. The mismatch in instrument identification can indicate potential issues, such as loading of the wrong surgical instrument, tampering with the passive NFC tag, or software configuration errors. If a mismatch is detected, the processor 204 triggers a visual alert that is transmitted to the vision cart 120, where it is rendered on a user interface, usually a touchscreen or display panel. The visual alert may include key details, such as the instrument ID, the AI-identified type, the nature of the mismatch, and recommended next steps. The user interface may also include interactive elements, such as acknowledgment buttons or override request options. By clearly presenting a visual warning, the sub-system 142 ensures that clinicians are promptly informed of the issue, enabling them to take corrective action or initiate an authorized override if required.
[0064] FIGs. 3A and 3B are collectively a flowchart of a method for operating a robotic surgical system, in accordance with an embodiment of the present disclosure. FIGs. 3A and 3B are described in conjunction with elements from FIGs. 1 and 2. With reference to FIGs. 3A and 3B, there is shown a method 300 for operating the robotic surgical system 100. The method 300 includes steps 302 to 314. The processor 204 of the local edge computing device 202 is configured to execute the method 300.
[0065] The method 300 for operating the robotic surgical system 100 introduces a robust, automated instrument validation workflow that enhances patient safety and procedural integrity. The method 300 begins with the installation of the surgical instrument onto a robotic arm via a sterile adapter, followed by the retrieval of a unique identifier and usage count using the NFC reader 210 integrated into the robotic arm. Simultaneously, the endoscopic camera 138 inserted through a trocar captures real-time or near real-time images of the surgical instrument at the surgical site. The trained AI model 208 then analyzes the video stream to identify the type of surgical instrument based on its visual characteristics. The method 300 comprises performing a dual verification by comparing the AI-identified instrument type with the identifier retrieved from the NFC tag. If a mismatch is detected or the instrument has exceeded its authorized usage limit, the method 300 comprises inhibiting further use of the surgical instrument and raising an alert. Only upon successful validation is the surgeon permitted to control the robotic arm and proceed with the procedure. This method offers significant advantages, including prevention of unauthorized or incorrect instrument usage, real-time verification with minimal workflow disruption, and enhanced traceability through automated data logging.
[0066] Referring to FIG. 3A, at step 302, the method 300 comprises installing a surgical instrument onto at least one robotic arm of the robotic surgical system 100 via a sterile adapter. The step 302 initiates the instrument setup by securely attaching the surgical instrument to the robotic arm (e.g., the robotic arm 113) in a sterile operating environment. The sterile adapter acts as a mechanical and electrical interface between the robotic arm’s end effector and the surgical instrument, ensuring contamination-free transfer of force, signals, and power.
[0067] At step 304, the method 300 further comprises retrieving a unique identifier and a usage count associated with the surgical instrument by reading a Near Field Communication (NFC) tag embedded in the surgical instrument using an NFC reader mounted on the at least one robotic arm. The step 304 enables the robotic surgical system 100 to automatically identify and monitor the surgical instrument being used by accessing essential metadata stored in the embedded NFC tag. The NFC reader 210, integrated into the robotic arm, wirelessly reads the NFC tag once the instrument is mounted, extracting data such as the instrument’s unique ID, usage count, and potentially other attributes, like model number or sterilization history. The retrieval step occurs in real-time or near real-time and forms the basis for validating the surgical instrument’s authorization and lifecycle status.
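Purely as an assumption-laden sketch, the retrieval at step 304 could surface the tag contents through a driver object like the hypothetical `nfc_reader` below; the driver API shown does not correspond to any specific reader IC.

```python
def retrieve_instrument_metadata(nfc_reader) -> dict:
    """Read the embedded NFC tag once the instrument is mounted (hypothetical driver API)."""
    raw = nfc_reader.read_tag()  # wireless read performed by the arm-mounted reader
    return {
        "unique_id": raw["uid"],                 # instrument's unique ID
        "usage_count": int(raw["usage_count"]),  # lifecycle counter
        "model_number": raw.get("model"),        # optional attribute
        "sterilization_history": raw.get("sterilization", []),
    }
```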
[0068] At step 306, the method 300 comprises capturing and displaying images of a surgical site in real-time or near real-time using the endoscopic camera 138 inserted into a patient’s body through a trocar. The step 306 enables continuous visualization of the internal anatomy during robotic-assisted surgery. The endoscopic camera 138, mounted on the dedicated robotic arm (i.e., the robotic arm 113), is guided through a trocar, a specialized surgical port that provides access to internal cavities, which is separate from the trocar through which the surgical instrument is inserted into the patient’s body, ensuring a minimally invasive approach. Once inside, the endoscopic camera 138 captures high-definition, stereoscopic images of the surgical instrument and transmits them to the 3D display monitor on the surgeon console 130, allowing the surgeon to view the site with depth perception and precision.
[0069] At step 308, the method 300 comprises identifying a type of the surgical instrument from the captured images using the trained AI model 208. The step 308 involves analyzing video frames of the surgical instrument captured in real-time or near real-time by the endoscopic camera 138 to determine the instrument type based on its visual features. The trained AI model 208, such as a CNN model, an RNN model, a transformer-based model, or any other neural network architecture-based model, is executed on the local edge computing device 202, which processes the visual data to classify the instrument type by recognizing characteristics, such as tip geometry, articulation pattern, and structural profile. Other examples of the trained AI model 208 may include, but are not limited to, a Vision Transformer (ViT)-based model, a You Only Look Once (YOLO)-based model (e.g., YOLOv5 or YOLOv7), a Mobile Convolutional Neural Network (MobileNet) model, an Efficient Convolutional Neural Network (EfficientNet) model, or a hybrid architecture-based model, such as a Residual Network (ResNet) model combined with a Long Short-Term Memory (LSTM) network-based model. The aforementioned AI models may be selected and trained based on system requirements for real-time inference, accuracy, and hardware compatibility.
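As one illustrative possibility (not the claimed model itself), a ResNet-18 backbone with a replaced classification head could perform the per-frame instrument typing; the label set and preprocessing are assumptions, and trained weights would be loaded separately.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T

CLASSES = ["grasper", "scissors", "needle_holder", "energy_device"]  # example label set

model = models.resnet18(weights=None)  # backbone only; trained weights loaded separately
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

preprocess = T.Compose([T.ToPILImage(), T.Resize((224, 224)), T.ToTensor()])

def classify_frame(frame) -> str:
    """Classify one RGB video frame (H x W x 3 uint8 array) into an instrument type."""
    x = preprocess(frame).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return CLASSES[int(logits.argmax(dim=1))]
```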
[0070] At step 310, the method 300 comprises performing a dual verification by comparing the identified type of the surgical instrument with the retrieved unique identifier. The step 310 involves cross-referencing two independent sources of instrument identification: the type of the surgical instrument determined through AI-based visual classification and the unique instrument identifier read from the passive NFC tag embedded on the surgical instrument through the NFC reader 210. The trained AI model 208 analyses the instrument's visual characteristics captured by the endoscopic camera 138, while the NFC reader 210 extracts metadata, such as the instrument ID and pre-associated type. The processor 204 compares these two data points to verify whether they correspond to the same surgical instrument or not.
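The comparison at step 310 reduces to checking agreement between the two independent sources, as in this sketch; the ID-to-type lookup is a hypothetical stand-in for the type pre-associated with the tag ID.

```python
# Hypothetical lookup of the instrument type registered against each tag ID,
# e.g., populated from the manufacturing records in the centralized database.
ID_TO_TYPE = {"SN-000123": "needle_holder"}

def dual_verify(ai_identified_type: str, tag_id: str) -> bool:
    """Return True only when both identification sources agree."""
    expected = ID_TO_TYPE.get(tag_id)
    return expected is not None and expected == ai_identified_type

assert dual_verify("needle_holder", "SN-000123")  # match: procedure may proceed
assert not dual_verify("grasper", "SN-000123")    # mismatch: inhibit and alert
```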
[0071] At step 312, the method 300 comprises inhibiting continuation of the surgical procedure and generating an alert if the comparison between the retrieved identifier and the AI-identified instrument type indicates a mismatch or if the usage count exceeds an authorized limit. The step 312 enables use of the correct and approved surgical instruments within their validated lifecycle. If the dual verification process detects a discrepancy between the instrument type identified by the trained AI model 208 (based on the live endoscopic video feed) and the unique identifier retrieved from the embedded NFC tag, or if the usage count recorded on the NFC tag exceeds its predefined threshold, the method 300 immediately halts the ongoing procedure. Simultaneously, a warning alert is triggered and displayed on the user interface to notify the surgical team or the surgeon.
[0072] At step 314, the method 300 comprises enabling a surgeon to control the at least one robotic arm to manipulate the validated surgical instrument upon successful completion of the dual verification. Once the type of the surgical instrument identified through AI-based visual recognition matches the unique identifier retrieved from the embedded NFC tag, and its usage count is confirmed to be within the authorized limit, control of the validated instrument is handed over to the surgeon. The surgeon, operating from the surgeon console 130, can then precisely manipulate the robotic arm 113 and perform surgical tasks such as grasping, cutting, suturing, or dissecting, using the verified instrument.
[0073] The steps 302 to 314 are only illustrative, and other alternatives can also be provided where one or more steps are added, or one or more steps are provided in a different sequence without departing from the scope of the claims herein.
[0074] There is provided a computer program product comprising instructions for performing the method 300 when executed by one or more processors (e.g., the processor 204) in the local edge computing device 202 of the robotic surgical system 100. The computer program is implemented as an algorithm, embedded in software stored in the non-transitory computer-readable storage medium having program instructions stored thereon, the program instructions being executable by the one or more processors in the computer system to execute the method 300. The non-transitory computer-readable storage medium may include, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. Examples of implementation of the computer-readable storage medium include, but are not limited to, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Random Access Memory (RAM), a Read Only Memory (ROM), a Hard Disk Drive (HDD), a Flash memory, a Secure Digital (SD) card, a Solid-State Drive (SSD), and/or a CPU cache memory.
[0075] In accordance with an embodiment, the method 300 comprises transmitting metadata associated with the surgical instrument to the centralized cloud database 212 over a secure communication protocol. The metadata is transmitted to the centralized cloud database 212 to maintain a permanent and accessible record of the instrument’s operational history across multiple surgeries and facilities. The transmission is carried out using a secure communication protocol, such as Transport Layer Security (TLS), to protect the confidentiality and integrity of the data during transfer. The metadata is first collected and processed by the local edge computing device 202 (e.g., an industrial PC running the control software), and then transmitted to the centralized cloud database 212 using a secure cloud gateway. Event-driven functions hosted on the cloud may be used to receive the metadata, trigger alert generation, or log the metadata in structured cloud databases.
[0076] Furthermore, the method 300 comprises maintaining records of the surgical instrument usage across multiple robotic systems and surgical facilities within the centralized cloud database 212, and restricting reuse of the surgical instrument beyond an authorized usage limit based on the maintained records. Upon successful validation of the surgical instrument, the information is securely transmitted to the centralized cloud database 212 using encrypted communication protocols (e.g., MQTT over TLS). The centralized cloud database 212 logs this data and updates the cumulative usage record for the surgical instrument. Once an instrument reaches its authorized usage threshold, a rule engine or cloud function flags it and sends an update back to the robotic system. The control software (i.e., the processor 204) on the robotic surgical system 100, upon receiving this status, automatically inhibits reuse of that surgical instrument by preventing activation or triggering an alert if the instrument is mounted.
[0077] In accordance with an embodiment, the method 300 comprises storing instrument-specific data recorded at the time of manufacturing at the centralized cloud database 212, thereby enabling traceability and post-operative verification of the surgical instrument in case of any suspected tampering, and where the instrument-specific data includes one or more of: a unique instrument identifier, an instrument type, and an authorized usage count of the surgical instrument. At the time of manufacturing, each surgical instrument is registered in the system with its metadata. The metadata is uploaded to the centralized cloud database 212 using secure communication protocols (e.g., TLS). Once stored, the metadata becomes a reference record accessible to authorized robotic systems and healthcare administrators. During surgical procedures, the local edge computing device 202 retrieves live data from the instrument's embedded NFC tag and compares it against the manufacturing record in the centralized cloud database 212. If discrepancies are detected, such as an unregistered ID or altered usage limits, the system can block usage, raise alerts, or log the incident for review. In cases of legal or medical disputes, the stored manufacturing metadata acts as a trusted source of truth to verify instrument history and usage compliance.
[0078] FIG. 4 illustrates a process of an Artificial Intelligence (AI)-based instrument validation and usage monitoring system for a robotic surgical system, in accordance with an embodiment of the present disclosure. FIG. 4 is described in conjunction with elements of FIGs. 1, 2, 3A, and 3B. With reference to FIG. 4, there is shown a process 400 followed by an AI-based instrument validation and usage monitoring sub-system (i.e., the sub-system 142), a part of the robotic surgical system 100 (of FIG. 1). The process 400 includes a series of operations 402 to 426.
[0079] At operation 402, the process 400 begins with the surgical instrument with a passive NFC tag. The NFC tag stores metadata associated with the surgical instrument including the surgical instrument’s unique identifier (ID), model number, usage count, sterilization history, and expiration date.
[0080] At operation 404, once the surgical instrument is mounted onto the robotic arm (i.e., one robotic arm of the three robotic arms 112), the NFC reader 210 reads the data encoded in the NFC tag. The NFC reader 210 retrieves the unique ID and usage count of the surgical instrument and forwards it to the processor 204 for verification.
[0081] At operation 406, the surgical instrument is physically held and manipulated by the robotic arm’s end effector. The end effector of the robotic arm provides a mechanical interface and transmits actuation signals to control the surgical instrument's functions during the surgical procedure.
[0082] At operation 408, the robotic arm (i.e., one robotic arm of the three robotic arms 112) receives control signals from the surgeon console 130 and performs precise movements using the attached surgical instrument.
[0083] At operation 410, the process 400 includes capturing a real-time or near real-time video of the surgical instrument at the surgical site using the endoscopic camera 138. In an exemplary implementation, the endoscopic camera 138 may be a telescope mounted on a dedicated robotic arm, such as the robotic arm 113 illustrated in FIG. 1. The endoscopic camera 138 is optically aligned with the surgical field and configured to capture high-resolution images of the surgical instruments inserted into the patient’s body and the surrounding anatomical environment during the surgical procedure. The captured optical data is transmitted to a camera processing unit, which processes the video signal and generates an output stream compliant with High-Definition Multimedia Interface (HDMI) standards. The HDMI output is further utilized for two purposes: (i) real-time visual display on a monitor to provide the surgeon with high-fidelity imagery of the surgical site, and (ii) AI-based analysis by a trained machine learning model (i.e., the trained AI model 208) configured to identify and validate the surgical instrument type based on the visual features captured in the video stream.
[0084] At operation 412, the HDMI output generated by the camera processing unit is transmitted to a video splitter processing board. In an exemplary embodiment, the video splitter processing board comprises a Field Programmable Gate Array (FPGA)-based architecture configured to process and duplicate the incoming video signal. The FPGA-based board is operatively configured to split the HDMI video signal into two parallel output streams. The first output stream is routed to the surgeon console 130 and displayed on a 3D visualization monitor, such as a 3D Full High Definition (FHD) monitor, to provide the surgeon with real-time stereoscopic visualization of the surgical field. The second output stream is converted and transmitted via a Universal Serial Bus (USB) interface to an AI processing module (i.e., the trained AI model 208) for real-time or near real-time analysis.
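Assuming the splitter's USB output enumerates as a standard capture device, the AI processing module could ingest the second stream with OpenCV as sketched below, reusing the illustrative `classify_frame` helper above; the device index is an assumption.

```python
import cv2  # assumes the USB output appears as a standard video capture device

cap = cv2.VideoCapture(0)  # device index of the splitter's USB stream (assumption)
while cap.isOpened():
    ok, frame = cap.read()  # one BGR frame from the second output stream
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # model expects RGB input
    instrument_type = classify_frame(rgb)         # per-frame AI identification
cap.release()
```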
[0085] At operation 414, the 3D monitor on the surgeon console 130 receives the HDMI signal from the video splitter processing board and displays a high-resolution, real-time stereoscopic image of the surgical field.
[0086] At operation 416, the trained AI model 208 is configured to analyse the video generated by the endoscopic camera 138 and classify the type of the surgical instrument based on its visual features. The classification result is sent to the processor 204. In an implementation scenario, the processor 204 may be a part of an Industrial PC (IPC) with control software and the IPC may be used as the local edge computing device 202.
[0087] At operation 418, the processor 204 (i.e., the IPC) receives inputs from both the NFC reader 210 and the trained AI model 208, performs dual verification, and coordinates communication with the centralized cloud database 212. The processor 204 can inhibit the procedure or allow it to proceed based on verification results. If a mismatch occurs, the IPC with control software triggers a procedure inhibition and raises an alert, while also offering override capability through a secure surgeon interface.
[0088] At operation 420, a cloud computing infrastructure is configured to provide serverless computing and to process surgical instrument validation requests from surgical facilities (for example, endoscopy suites, robotic surgery centres, orthopaedic units, and the like). The cloud computing infrastructure performs background tasks, such as updating logs, generating alerts, and forwarding data to storage and databases.
[0089] At operation 422, an IoT-based Core may be used as a secure cloud gateway in an exemplary implementation scenario, transmitting data between the local edge computing device 202 and the cloud computing infrastructure using protocols like MQTT over TLS. The IoT-based Core ensures low-latency and encrypted communication.
[0090] At operation 424, the centralized cloud database 212 stores the metadata associated with the surgical instrument. Structured data, such as the instrument ID, usage count, validation timestamps, and facility information, is stored in the centralized cloud database 212 for structured querying and long-term tracking of surgical instrument usage across multiple hospitals or robotic systems. Additional metadata, such as camera footage (for AI model training), historical logs, reports, and audit trails, may also be stored in the centralized cloud database 212 for compliance review and future AI model training.
[0091] At operation 426, a web application dashboard provides a user-friendly interface for surgeons, administrators, and compliance officers. The web application dashboard displays instrument validation results, alerts, usage summaries, and override history, enabling real-time monitoring and data-driven decision-making.
[0092] FIG. 5 is a block diagram illustrating a surgical instrument validation process within a robotic surgical system, in accordance with an embodiment of the present disclosure. FIG. 5 is described in conjunction with elements of FIGs. 1, 2, 3A, 3B, and 4. With reference to FIG. 5, there is shown a block diagram 500 illustrating the surgical instrument validation process within the robotic surgical system 100. The surgical instrument validation process includes a series of operations 502 to 520.
[0093] At operation 502, a passive NFC tag is embedded within the surgical instrument. The NFC tag stores metadata associated with the surgical instrument, including the instrument's unique ID, model number, usage count, sterilization history, and expiration date, forming the foundation of the tamper-proof tracking of the surgical instrument.
[0094] At operation 504, the NFC reader 210 reads the NFC tag. When the surgical instrument is positioned within the reader's range, the NFC reader 210 communicates with the NFC tag to retrieve the encoded data and forwards it onward via, for example, a Serial Peripheral Interface (SPI) communication protocol, meeting the sub-50 ms latency requirement for real-time operation.
[0095] At operation 506, the SPI communication protocol facilitates high-speed digital data transfer between the NFC reader 210 and an embedded microcontroller that is the processor 204, enabling efficient reception of the surgical instrument metadata. The SPI interface ensures robust communication integrity and low transmission latency, essential for downstream verification processes.
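On a Linux-based controller, the SPI transfer could look like the following sketch using the spidev userspace driver; the opcode and response length are hypothetical, as the actual register map depends on the reader IC.

```python
import spidev  # Linux userspace SPI driver; the reader register map below is hypothetical

spi = spidev.SpiDev()
spi.open(0, 0)                 # bus 0, chip-select 0 wired to the NFC reader IC
spi.max_speed_hz = 10_000_000  # a high clock keeps the transfer well under 50 ms

READ_TAG_DATA = 0x02           # placeholder opcode for "read cached tag payload"
payload = spi.xfer2([READ_TAG_DATA] + [0x00] * 64)[1:]  # clock out 64 response bytes
spi.close()
```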
[0096] At operation 508, the processor 204 receives metadata comprising the unique instrument identifier (ID), usage count, and additional relevant parameters such as expiration date, model number, and manufacturer ID. In an exemplary implementation, this metadata is processed by a firmware stack running on the processor 204, which integrates an EtherCAT slave IC for deterministic fieldbus communication. The processor 204 also initiates a first-stage validation to verify lifecycle constraints and expiration status. Simultaneously, the processor 204 executes the trained AI model 208 for second-stage verification, performing classification and anomaly detection based on the real-time video feed.
[0097] At operation 510, the EtherCAT slave IC integrated with the processor 204 interfaces with the EtherCAT fieldbus system to prepare the metadata packet for transmission. The EtherCAT protocol ensures deterministic delivery of the metadata to the control system, maintaining timing synchronisation across distributed nodes of the robotic surgical system 100.
[0098] At operation 512, the EtherCAT slave IC is used to forward the validated metadata from the processor 204 to the EtherCAT master node hosted on a Single Board Computer (SBC). The transmission occurs over the EtherCAT fieldbus network, ensuring low-latency, real-time communication with guaranteed delivery order and redundancy mechanisms.
[0099] At operation 514, the SBC equipped with the EtherCAT master stack receives the metadata from the processor 204. The SBC serves as a coordination node for the local edge computing device 202 and further handles transmission of validated metadata to the centralized cloud database 212. The SBC utilizes a dedicated Ethernet port to establish a secure TLS-encrypted connection with the centralized cloud database 212, ensuring integrity and confidentiality of transmitted data. The Ethernet interface also enables real-time synchronization and integration with hospital-wide data systems.
[0100] At operation 516, the Ethernet port of the SBC provides a direct data path for secure bidirectional communication with the cloud backend. The Ethernet port is also responsible for routing internal traffic between the local edge computing device 202 and the broader hospital network, thereby facilitating dynamic updates, firmware patches, and system diagnostics.
[0101] At operation 518, the Ethernet LAN communication is established between the SBC and the local edge computing device 202 (e.g., an Industrial PC), allowing seamless coordination of the control software with AI-based instrument validation processes. The Ethernet LAN interface supports multi-threaded data exchange, ensuring high-throughput transfer of video-based AI inference results and metadata validation signals in real-time.
[0102] At operation 520, the SBC interfaced with the local edge computing device 202 functions as the central command and coordination hub for the AI-based surgical instrument validation sub-system. The local edge computing device 202 governs the overall operation of the robotic surgical system 100, orchestrating it through four discrete operational states: idle, hand guiding, instrument exchange, and surgical. Throughout the operational states, the robotic surgical system 100 maintains continuous oversight of health parameters such as power integrity, device connectivity, and fault status. When the system transitions into surgical mode, the control software hosted on the local edge computing device 202 immediately triggers the trained AI model 208 to begin real-time validation. The trained AI model 208 processes input from both the NFC reader 210 and visual classification results from the camera processing system, ensuring the instrument in use is authentic, within its authorized usage range, and correctly identified.
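The four operational states and the validation trigger on entry into surgical mode could be modelled as a simple state machine, sketched below with illustrative names only.

```python
from enum import Enum, auto

class SystemState(Enum):
    IDLE = auto()
    HAND_GUIDING = auto()
    INSTRUMENT_EXCHANGE = auto()
    SURGICAL = auto()

class Controller:
    """Illustrative coordination hub mirroring the four discrete operational states."""
    def __init__(self) -> None:
        self.state = SystemState.IDLE

    def transition(self, new_state: SystemState) -> None:
        self.state = new_state
        if new_state is SystemState.SURGICAL:
            self.start_validation()  # AI validation begins on entering surgical mode

    def start_validation(self) -> None:
        print("triggering trained AI model for real-time instrument validation")

Controller().transition(SystemState.SURGICAL)
```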
[0103] FIG. 6 is a block diagram illustrating an endoscopic video processing within a robotic surgical system, in accordance with an embodiment of the present disclosure. FIG. 6 is described in conjunction with elements of FIGs. 1, 2, 3A, 3B, 4 and 5. With reference to FIG. 6, there is shown a block diagram 600 illustrating an endoscopic video processing within the robotic surgical system 100. The endoscopic video processing includes a series of operations 602 to 618.
[0104] At operation 602, the endoscopic camera 138 (for example, 3D full HD (FHD) endoscopic camera) equipped with a telescope captures real-time or near real-time images of the surgical site. The endoscopic camera 138 is configured to deliver stereoscopic video, ensuring accurate depth perception for both surgical control and instrument classification.
[0105] At operation 604, the video captured by the endoscopic camera 138 is transmitted to the camera processing unit (for example, a 3D Full High Definition (FHD) endoscopic camera processing unit), which processes the raw video stream into a standard high-definition video signal. The camera processing unit performs tasks such as image sharpening, format conversion, and possibly frame synchronization for dual-channel stereoscopic output.
[0106] At operation 606, the high-definition processed video signal is transmitted from the camera processing unit to a video splitter processing board via a HDMI cable. The HDMI cable ensures high-speed, lossless transmission of stereoscopic video signals between the camera processing unit and downstream processing modules.
[0107] At operation 608, the video splitter processing board (such as a Field Programmable Gate Array (FPGA)-based board) performs real-time video distribution and preprocessing of the received video signal. The processing board splits the 3D video into two parallel streams: a first video stream is directed to the monitor on the surgeon console 130 for real-time viewing by the surgeon, and a second video stream is converted to a digital format (e.g., USB or CSI interface) and transmitted to the processor 204.
[0108] At operation 610, the first output video stream is displayed to the surgeon console 130 on the 3D monitor (such as 3D FHD Monitor), allowing for direct visualization of the surgical instrument as well as the surgical site. At operation 612, the second video stream is analysed by the trained AI model 208 to identify the type of the surgical instrument in real-time from the endoscopic video feed. The processor 204 performs dual verification by comparing the surgical instrument type identified via the trained AI model 208 with the Unique Instrument Identifier (UID) previously retrieved from the passive NFC tag embedded on the surgical instrument. If the identification results from the trained AI model 208 and the passive NFC tag metadata match, the surgical instrument is validated for surgical use. Otherwise, the sub-system 142 inhibits the continuation of the surgical procedure and raises an alert on the 3D monitor.
[0109] At operation 614, the validation result and relevant metadata are transmitted from the processor 204 to the local edge computing device 202 via an Ethernet Local Area Network (LAN) connection. The Ethernet LAN cable provides high-bandwidth, low-latency communication between the processing module and the system controller, enabling real-time synchronization of validation outcomes with the broader surgical control infrastructure.
[0110] At operation 616, the local edge computing device 202, implemented as the control software running on an industrial PC, communicates with the processor 204 via the Ethernet LAN connection to coordinate the AI-based validation process with the robotic surgical system 100. When the robotic surgical system 100 transitions to the surgical mode, the local edge computing device 202 triggers the trained AI model 208 to initiate continuous, real-time validation of the surgical instrument. The trained AI model 208 processes the endoscopic video stream to identify the surgical instrument type, and the results are routed back to the processor 204 for verification. The validation outcomes are subsequently used for enforcing patient safety protocols, logging surgical instrument usage data, and maintaining compliance with predefined surgical limits as part of the integrated surgical instrument tracking and validation infrastructure.
[0111] Thus, the present disclosure provides the sub-system 142, offering enhanced surgical instrument validation, real-time monitoring, and comprehensive usage tracking capabilities. The present disclosure incorporates the local edge computing device 202 comprising the processor 204 operatively connected to the NFC reader 210 and the endoscopic camera 138, ensuring simultaneous data acquisition from the passive NFC tags and the real-time video streams while maintaining high-speed processing capabilities at the surgical site. Additionally, the dual verification mechanism, comprising the trained AI model 208 and the passive NFC tag reading functionality, enables precise instrument identification and validation by comparing the AI-identified instrument types with unique instrument identifiers retrieved from the passive NFC tags, ensuring accurate instrument classification and preventing unauthorized usage. The inclusion of the trained AI model 208, configured to classify surgical instruments based on tip geometry and articulation features, provides robust real-time instrument recognition capabilities, while the periodic re-training functionality using updated labelled imaging datasets ensures continuous improvement and adaptation to new surgical instrument types. Furthermore, the secure communication protocol comprising the TLS encryption and message queuing capabilities ensures protected metadata transmission to the centralized cloud database 212, enabling comprehensive usage tracking across multiple surgical systems and facilities while preventing instrument reuse beyond authorized limits. These advancements provide superior instrument validation accuracy, enhanced surgical safety, and comprehensive usage monitoring over conventional surgical systems, ensuring optimal instrument utilization, preventing counterfeit instrument usage, and maintaining regulatory compliance in robotic surgical environments.
[0112] Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as "including", "comprising", "incorporating", "have", "is" used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments. The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". It is appreciated that certain features of the present disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the present disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable combination or as suitable in any other described embodiment of the disclosure.
CLAIMS
We claim:
1. A robotic surgical system (100), comprising:
a patient-side cart (110) comprising a plurality of robotic arms, wherein at least one of the plurality of robotic arms is configured to hold a surgical instrument;
a surgeon console (130) configured to receive inputs from a surgeon to control the plurality of robotic arms;
a vision cart (120) configured to process and display images from a surgical site; and
a sub-system (142) configured to validate and monitor usage of the surgical instrument, the sub-system (142) comprising:
a local edge computing device (202) comprising a processor (204) operatively connected to:
a Near Field Communication, NFC, reader (210) configured to read a passive NFC tag embedded on the surgical instrument to retrieve a unique instrument identifier, ID, and a usage count of the surgical instrument; and
an endoscopic camera (138) configured to capture a video stream of the surgical site in real-time or near real-time;
wherein the processor (204) is configured to:
execute a trained Artificial Intelligence, AI, model (208) to identify a type of the surgical instrument in real-time or near real-time from the endoscopic video stream;
perform a dual verification by comparing the type of the surgical instrument identified by the trained AI model (208) with the unique instrument identifier retrieved from the NFC tag;
transmit metadata associated with the surgical instrument to a centralized cloud database (212) over a secure communication protocol, wherein the centralized cloud database (212) is configured to maintain records of the surgical instrument usage across multiple surgical systems and facilities, and to prevent reuse of the surgical instrument beyond authorized limits; and
inhibit continuation of a surgical procedure and raise an alert if the unique instrument identifier and the AI-identified instrument type do not match, or if the usage count of the surgical instrument exceeds the authorized limit, thereby enhancing patient safety by preventing use of an incorrect or unauthorized surgical instrument.
2. The robotic surgical system (100) as claimed in claim 1, wherein the trained AI model (208) is configured to classify the surgical instrument based on tip geometry and articulation features.
3. The robotic surgical system (100) as claimed in claim 1, wherein the processor (204) is configured to periodically re-train the trained AI model (208) using updated labelled imaging datasets of surgical instruments comprising graspers, scissors, needle holders, energy devices, staplers, and suction-irrigation tools.
4. The robotic surgical system (100) as claimed in claim 1, wherein the NFC tag is further configured to store additional metadata comprising at least one of: manufacturer identifier, sterilization history, model number of the surgical instrument, or expiration date.
5. The robotic surgical system (100) as claimed in claim 1, wherein the secure communication protocol comprises an encrypted connection established using Transport Layer Security, TLS, and a message queuing protocol configured to transmit metadata to the centralized cloud database (212).
6. The robotic surgical system (100) as claimed in claim 1, wherein the processor (204) is further configured to locally store a temporary log of instrument verification events within the local edge computing device (202) prior to transmission to the centralized cloud database (212).
7. The robotic surgical system (100) as claimed in claim 1, wherein the centralized cloud database (212) is further configured to generate a usage report or trigger an alert when the surgical instrument approaches a predefined maximum usage threshold.
8. The robotic surgical system (100) as claimed in claim 1, wherein the centralized cloud database (212) is configured to store instrument-specific data recorded at the time of manufacturing, wherein the instrument-specific data includes one or more of: a unique instrument identifier, an instrument type and an authorized usage count, and thereby enabling traceability and post-operative verification of the surgical instrument in case of the system malfunction or suspected tampering with the sub-system.
9. The robotic surgical system (100) as claimed in claim 1, wherein the sub-system (142) is further configured to permit an override of the surgical procedure inhibition upon confirmation from the surgeon through a secured interface.
10. The robotic surgical system (100) as claimed in claim 1, wherein the endoscopic camera (138) is a stereoscopic camera configured to capture three-dimensional images of the surgical instrument and the surgical site.
11. The robotic surgical system (100) as claimed in claim 1, wherein the vision cart (120) is further configured to display a visual alert on a user interface when a mismatch is detected during the dual verification.
12. A method (300) for operating a robotic surgical system (100), comprising:
installing a surgical instrument onto at least one robotic arm of a robotic surgical system via a sterile adapter;
retrieving a unique identifier and a usage count associated with the surgical instrument by reading a Near Field Communication (NFC) tag embedded in the surgical instrument using an NFC reader mounted on the at least one robotic arm;
capturing and displaying images of a surgical site in real-time or near real-time using an endoscopic camera inserted into a patient’s body through a trocar;
identifying a type of the surgical instrument from the captured images using a trained Artificial Intelligence, AI, model (208);
performing a dual verification by comparing the identified type of the surgical instrument with the retrieved unique identifier;
inhibiting continuation of the surgical procedure and generating an alert if the comparison between the retrieved identifier and the AI-identified instrument type indicates a mismatch or if the usage count exceeds an authorized limit; and
enabling a surgeon to control the at least one robotic arm to manipulate the validated surgical instrument upon successful completion of the dual verification.
13. The method as claimed in claim 12, comprising:
transmitting metadata associated with the surgical instrument to a centralized cloud database (212) over a secure communication protocol;
maintaining records of the surgical instrument usage across multiple robotic systems and surgical facilities within the centralized cloud database (212); and
restricting reuse of the surgical instrument beyond an authorized usage limit based on the maintained records.
14. The method as claimed in claim 12, comprising storing instrument-specific data recorded at the time of manufacturing at the centralized cloud database (212), thereby enabling traceability and post-operative verification of the surgical instrument in case of any suspected tampering, and wherein the instrument-specific data includes one or more of: a unique instrument identifier, an instrument type, and an authorized usage count of the surgical instrument.