
A Navigation System For Tracking Skin Movement During Surgery And A Method Thereof

Abstract: A navigation system for tracking skin movement during surgery and a method thereof is disclosed. A first tracker (155) is positioned on a vertebra (150). A plurality of second trackers (145) are positioned around the spine at a distance from the retracted skin. A camera (135) captures video streams to detect the trackers. A processor (125) executes instructions to receive the video streams and analyse them using an image recognition model (220) to track a delta change in distance between the second trackers (145), to track the delta change in distance between the second trackers (145) and the first tracker (155), and to display the video streams pertaining to the delta change in distance between the first tracker (155) and the second trackers (145), wherein the video streams comprise a displacement between the first tracker (155) and the second trackers (145), and the displacement of the first tracker (155) is displayed with respect to the second trackers (145). FIG. 1


Patent Information

Application #
Filing Date
03 October 2025
Publication Number
44/2025
Publication Type
INA
Invention Field
BIO-MEDICAL ENGINEERING
Status
Parent Application

Applicants

HAPPY RELIABLE SURGERIES PRIVATE LIMITED
14, 8TH CROSS RD, 1ST N BLOCK, RAJAJINAGAR, BENGALURU, KARNATAKA- 560010, INDIA

Inventors

1. ARPIT PALIWAL
HAPPY RELIABLE SURGERIES PRIVATE LIMITED, 14, 8TH CROSS RD, 1ST N BLOCK, RAJAJINAGAR, BENGALURU, KARNATAKA- 560010, INDIA

Specification

Description: FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to a field of navigation systems, surgery and more particularly to a navigation system for tracking skin movement during surgery and a method thereof.
BACKGROUND
[0002] In modern surgical practices, particularly those involving the spine, navigation systems have become integral tools. These systems utilize pre-operative imaging data, such as magnetic resonance imaging (MRI) or computed tomography (CT) scans, and intra-operative tracking technologies to guide surgical instruments with high precision. By overlaying the imaging data onto the patient’s anatomy, surgeons can operate with greater confidence and accuracy, reducing the risk of errors.
[0003] One of the major components of these systems involves the use of physical tracking markers, or reference trackers, which are attached to the patient’s body near the surgical site. These trackers communicate with external cameras and software systems to maintain spatial awareness of both the patient’s anatomy and the surgical instruments. However, the effectiveness of such systems is heavily dependent on the stability and positioning of these tracking markers throughout the surgical procedure.
[0004] In spine surgeries, where soft tissue retraction is often necessary to expose the vertebrae, physical displacement of the skin can lead to unintended movement of the attached trackers. Even small shifts in the tracker’s position can compromise the registration accuracy between the imaging data and the actual patient anatomy. Such discrepancies may result in navigation errors, potentially compromising the outcome of the surgery.
[0005] Moreover, since spine procedures often involve repeated adjustments of surgical exposure, there is a continuous risk of losing or corrupting the spatial data due to tracker displacement. Traditional systems that rely on a single reference tracker do not always offer adequate redundancy to detect or compensate for these shifts in real-time.
[0006] Hence, there is a need for an improved navigation system for tracking skin movement during surgery and a method thereof to address the aforementioned issue(s).
OBJECTIVES OF THE INVENTION
[0007] The primary objective of the invention is to provide a navigation system for tracking skin movement during spine surgery that maintains the accuracy of anatomical registration even when one or more tracking markers are displaced due to skin retraction or intraoperative movements.
[0008] Another objective of the invention is to implement a plurality of co-related trackers that can detect and compensate for displacement in real-time, ensuring that accurate navigation data is preserved throughout the surgical procedure.
[0009] Yet another objective of the invention is to enable the system to identify and calculate delta changes in the relative positions of the trackers, thereby allowing the navigation system to automatically adjust for any minor displacements without interrupting the surgery.
[0010] Yet another objective of the invention is to provide a method by which the system can distinguish between movement of the main tracker and that of the secondary trackers, using stored spatial relationships among the markers to detect and correct misalignment.
[0011] Yet another objective of the invention is to integrate a medical navigation unit capable of real-time imaging, registration, and display of displacement data, offering visual alerts when any tracker moves beyond a predefined threshold.
BRIEF DESCRIPTION
[0012] In accordance with an embodiment of the present disclosure, a navigation system for tracking skin movement during surgery is disclosed. The system includes a plurality of trackers adapted to be attached to a retracted skin at a surgery region of a spine during a surgery of a patient. The system includes a first tracker positioned at the surgery region on a vertebra of the spine. The system includes a plurality of second trackers positioned around the spine at a predetermined distance from the retracted skin at the surgery region. The system includes a camera adapted to capture one or more video streams during the surgery to detect the plurality of trackers. The system also includes a processor, and a memory coupled to the processor. The memory includes instructions that, when executed by the processor, cause the processor to: receive the one or more video streams from the camera in real time; analyse the one or more video streams using an image recognition model to track a delta change in distance between each of the plurality of second trackers during the surgery in real time; analyse the one or more video streams using the image recognition model to track the delta change in distance between the plurality of second trackers and the first tracker during the surgery in real time; and display the one or more video streams pertaining to the delta change in distance between the first tracker and the plurality of second trackers, wherein the one or more video streams comprise a displacement between the first tracker and the plurality of second trackers based on the retracted skin movement, and wherein the displacement of the first tracker is displayed with respect to the plurality of second trackers.
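The processing steps recited above, receiving tracker positions per frame, computing pairwise delta changes in distance, and reporting them for display, can be sketched as follows. This is an illustrative approximation only; the tracker identifiers ("T1" for the first tracker, "S1"/"S2" for second trackers) and all coordinates are invented for the example and are not taken from the disclosure.

```python
import itertools
import math

def pairwise_distances(positions):
    """Euclidean distance between every pair of tracker positions.

    `positions` maps a tracker id to an (x, y) coordinate in one video
    frame; returns {(id_a, id_b): distance}.
    """
    return {
        (a, b): math.dist(positions[a], positions[b])
        for a, b in itertools.combinations(sorted(positions), 2)
    }

def delta_changes(baseline, current):
    """Delta change per tracker pair between the registration (baseline)
    frame and the current frame (positive = trackers moved apart)."""
    base = pairwise_distances(baseline)
    now = pairwise_distances(current)
    return {pair: now[pair] - base[pair] for pair in base}

# Hypothetical registration frame: first tracker "T1" on the vertebra,
# second trackers "S1"/"S2" around the surgical region (units: mm).
baseline = {"T1": (0.0, 0.0), "S1": (30.0, 0.0), "S2": (0.0, 40.0)}
current  = {"T1": (0.0, 0.0), "S1": (33.0, 0.0), "S2": (0.0, 40.0)}
deltas = delta_changes(baseline, current)
# The ("S1", "T1") pair grew by 3.0 mm; ("S2", "T1") is unchanged.
```

A real system would obtain the positions from the image recognition model rather than hard-coded dictionaries, but the delta bookkeeping would take this general shape.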
[0013] In accordance with an embodiment of the present disclosure, a method for operating a navigation system for tracking skin movement during surgery is disclosed. The method includes attaching a plurality of trackers to a retracted skin at a surgery region of a spine during a surgery of a patient. The method includes positioning a first tracker at the surgery region on a vertebra of the spine. The method includes positioning a plurality of second trackers around the spine at a predetermined distance from the retracted skin at the surgery region. The method includes capturing, by a camera, one or more video streams during the surgery to detect the plurality of trackers pertaining to the patient. The method includes receiving, by a processor, the one or more video streams from the camera in real time. The method includes analysing, by the processor, the one or more video streams using an image recognition model to track a delta change in distance between each of the plurality of second trackers during the surgery in real time. The method includes analysing, by the processor, the one or more video streams using the image recognition model to track the delta change in distance between the plurality of second trackers and the first tracker during the surgery in real time. The method includes displaying, by a display screen, the one or more video streams pertaining to the delta change in distance between the first tracker and the plurality of second trackers, wherein the one or more video streams comprise a displacement between the first tracker and the plurality of second trackers based on the retracted skin movement, and wherein the displacement of the first tracker is displayed with respect to the plurality of second trackers.
[0014] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0016] FIG. 1 illustrates a block diagram representation of an exemplary network architecture of a navigation system for tracking skin movement during surgery, in accordance with an embodiment of the present disclosure;
[0017] FIG. 2 illustrates a schematic diagram of a navigation system for tracking skin movement during surgery of FIG. 1, in accordance with an embodiment of the present disclosure; and
[0018] FIG. 3 is a flow chart representing the steps involved in a method for operating a navigation system for tracking skin movement during surgery, in accordance with an embodiment of the present disclosure.
[0019] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0020] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0021] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or subsystems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0022] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0023] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0024] Embodiments of the present disclosure relate to a navigation system for tracking skin movement during surgery. The system includes a plurality of trackers adapted to be attached to a retracted skin at a surgery region of a spine during a surgery of a patient. The system includes a first tracker positioned at the surgery region on a vertebra of the spine. The system includes a plurality of second trackers positioned around the spine at a predetermined distance from the retracted skin at the surgery region. The system includes a camera adapted to capture one or more video streams during the surgery to detect the plurality of trackers. The system also includes a processor, and a memory coupled to the processor. The memory includes instructions that, when executed by the processor, cause the processor to: receive the one or more video streams from the camera in real time; analyse the one or more video streams using an image recognition model to track a delta change in distance between each of the plurality of second trackers during the surgery in real time; analyse the one or more video streams using the image recognition model to track the delta change in distance between the plurality of second trackers and the first tracker during the surgery in real time; and display the one or more video streams pertaining to the delta change in distance between the first tracker and the plurality of second trackers, wherein the one or more video streams comprise a displacement between the first tracker and the plurality of second trackers based on the retracted skin movement, and wherein the displacement of the first tracker is displayed with respect to the plurality of second trackers.
[0025] FIG. 1 illustrates a block diagram representation of an exemplary network architecture of a navigation system for tracking skin movement during surgery, in accordance with an embodiment of the present disclosure.
[0026] In one embodiment, a plurality of trackers is adapted to be attached to the retracted skin at a surgery region (160) of a spine during a surgery of a patient. During spine surgeries, it is common practice to retract the skin and associated soft tissues to expose the vertebral column and the underlying anatomy for access and visibility. Retracting the skin causes potential movement or deformation of the tissue, which in turn may displace any marker or tracker positioned on the surface. The plurality of trackers are strategically attached to the retracted skin and the surgery region (160) to ensure positional data can be effectively captured and monitored throughout the surgical procedure.
[0027] In one embodiment, the plurality of trackers may be passive or active optical markers, electromagnetic sensors, or hybrid tracking elements that are recognized and monitored by the navigation system (100). Each of the plurality of trackers is uniquely identifiable and contributes to the establishment of a dynamic reference frame used by the image recognition system or camera unit during the surgery. The trackers are affixed in such a way that their positional relationship remains stable relative to the anatomical landmarks of interest, or when movement occurs, such movement can be computationally correlated using redundant spatial references provided by the additional trackers.
[0028] A first tracker (155) is positioned at the surgery region (160) on a vertebra (150) of the spine. The first tracker (155) serves as a primary or reference tracker and is placed on the bony surface of a vertebra (150) segment involved in the surgical procedure. The placement on the vertebra (150) ensures stability, as the vertebrae provide a rigid and anatomically consistent platform that is less susceptible to soft tissue shifts or movement caused by retraction, respiration, or other intraoperative factors.
[0029] The first tracker (155) may be integrated with a rigid fixation device such as a clamp, pin, or bone screw, which is mechanically fastened to the exposed vertebra following retraction of the overlying skin and soft tissues. This mechanical fixation ensures that the tracker maintains a fixed relationship with the spinal anatomy throughout the surgery and acts as a reliable base for real-time navigation and anatomical referencing. The position of the first tracker (155) is recognized and monitored by the navigation system (100) and used to correlate the physical spine with preoperative imaging, such as MRI or CT scans, enabling accurate intraoperative guidance.
[0030] An example of the first tracker (155) includes, but is not limited to, a reflective optical marker mounted on a surgical clamp attached to the spinous process, a radiolucent sensor embedded within a vertebral screw head, or a miniaturized electromagnetic sensor affixed directly to the lamina or pedicle of the vertebra. The first tracker (155) may be configured with a unique spatial pattern or identification code, allowing the navigation system (100) to differentiate it from other trackers in use during the procedure.
[0031] A plurality of second trackers (145) are positioned around the spine at a predetermined distance from the retracted skin at the surgery region (160). The plurality of second trackers (145), also referred to as auxiliary or secondary trackers, function in coordination with the first tracker to provide a stable reference framework for monitoring any displacement or movement that may occur during the surgical procedure. The plurality of second trackers (145) are specifically placed outside the region where the skin and soft tissues are retracted to minimize the effect of tissue shifts, deformation, or pressure-induced displacement during the course of surgery.
[0032] The predetermined distance at which the plurality of second trackers (145) are placed is selected based on clinical guidelines, spatial constraints within the surgical field, and the dimensions of the retracted area. The plurality of second trackers (145) are affixed to anatomical points that are not directly manipulated during surgery, such as the adjacent skin surface, nearby bony prominences, or fixed drapes or instrument holders, thereby ensuring their relative immobility throughout the procedure.
[0033] In another embodiment, a camera (135) is adapted to capture one or more video streams during the surgery to detect the plurality of trackers. The camera (135) functions as a core component of the navigation system and serves as an input device for continuously acquiring visual data in real time. The captured video streams represent the spatial arrangement and any changes in the position or orientation of the trackers throughout the duration of the surgery. The data gathered by the camera (135) enables the processor to analyse relative positional shifts and displacements that may occur due to soft tissue retraction, instrument manipulation, or patient movement.
[0034] The camera (135) is positioned to maintain an unobstructed and stable line of sight to the plurality of trackers, both the first tracker (155) and the plurality of second trackers (145). The field of view, resolution, frame rate, and depth-sensing capabilities of the camera (135) are selected to ensure accurate detection of small positional changes, typically in the sub-millimetre range, which are critical in high-precision procedures such as spine surgeries. The captured video feeds are processed by the processor in real time for tracker detection and displacement analysis.
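The sub-millimetre sensitivity mentioned above depends on the imaging geometry. As a rough sketch under a simple pinhole-camera assumption (all numerical values below are illustrative assumptions, not parameters from the disclosure), the object-plane size of one pixel can be estimated as:

```python
def mm_per_pixel(sensor_width_mm, focal_length_mm, working_distance_mm,
                 image_width_px):
    """Approximate object-plane millimetres covered by one pixel for a
    pinhole camera viewing a plane at `working_distance_mm`."""
    field_of_view_mm = sensor_width_mm * working_distance_mm / focal_length_mm
    return field_of_view_mm / image_width_px

# Illustrative numbers only: a 36 mm wide sensor, 50 mm lens, camera
# 500 mm above the surgical field, frames 4096 pixels wide.
scale = mm_per_pixel(36.0, 50.0, 500.0, 4096)
# Roughly 0.09 mm per pixel, so even single-pixel tracker shifts are
# already sub-millimetre at this geometry.
```

In practice lens distortion, calibration, and sub-pixel marker localisation all affect the achievable accuracy; the formula above only bounds the nominal pixel footprint.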
[0035] In another embodiment, the navigation system (100) is configured to receive a magnetic resonance imaging report representing a three-dimensional anatomical structure of the spine as an input. The navigation system (100) interprets the magnetic resonance imaging report to reconstruct a virtual 3D model of the spine, which is then used as a baseline for intraoperative comparison with real-time data obtained from the trackers and video streams. This allows for highly accurate correlation between the actual physical landmarks on the patient and the virtual model displayed on the system interface. Such correlation is crucial when evaluating movements or shifts in skin or bony anatomy during surgery.
[0036] The navigation system (100) serves as a central module that integrates imaging data with real-time intraoperative tracking to assist the surgeon in maintaining spatial orientation and accuracy during spinal procedures. The magnetic resonance imaging report provides a volumetric reference of the patient’s spinal anatomy, including vertebrae, intervertebral discs, nerve roots, and surrounding tissues, enabling precise registration and navigation during surgery.
[0037] In an embodiment, a cart is adapted to support the navigation system (100). The cart serves as a mobile and stable platform for housing and organizing essential components of the navigation system (100), including computing hardware, display interfaces, power modules, and connection ports. The mobility of the cart enables ease of positioning within the operation theatre and facilitates ergonomic accessibility for the surgical team.
[0038] In an embodiment, one or more surgical probes (140) are adapted to assist in guiding a surgical instrument to a desired anatomical location of the spine during the surgery. The one or more surgical probes (140) function as real-time navigational aids that provide precise positional feedback. They may be tracked using optical, electromagnetic, or other sensor-based systems to determine their orientation and movement with respect to the patient’s spine. This assists the surgeon in executing accurate incisions or instrument insertions, minimizing tissue trauma and enhancing procedural outcomes.
[0039] An example of the one or more surgical probes (140) includes, but is not limited to, a tracked pointer, awl, cannulated pedicle probe, or access needle equipped with position sensors or reflective markers. The one or more probes (140) are often integrated into the navigation software so their trajectory can be visualized overlaid on the patient’s magnetic resonance imaging or computed tomography imaging data in real time.
[0040] In one embodiment, a display screen (115) is configured to display the one or more video streams pertaining to the delta change in distance (or displacement) between the first tracker (155) and the plurality of second trackers (145) based on the retracted skin movement. The video stream, captured by the camera (135) during surgery, is processed and rendered in real time on the display screen (115) to reflect the positional dynamics of the plurality of trackers. This includes both visual identification of the trackers and corresponding graphical or numerical overlays indicating displacement or movement with respect to a reference frame.
[0041] The display screen (115) specifically highlights displacement of the first tracker (155), which is positioned on the vertebra (150) at the surgery site, relative to the stationary or minimally displaced plurality of second trackers (145), which are placed around the surgical region (160) at a distance from the retracted skin. The video stream thus serves not only as a visual feed but also conveys data-rich content on changes in spatial configuration, such as shifts due to skin retraction, muscular adjustment, or external forces during surgical procedures.
[0042] In another embodiment, the display screen (115) is adapted to display a visual alert upon detecting displacement between the first tracker (155) and the plurality of second trackers (145) beyond a pre-defined displacement threshold. The pre-defined displacement threshold is determined based on the magnetic resonance imaging report, which represents the anatomical structure of the patient’s spine in a baseline, preoperative state. The display screen (115) acts as a user interface (275, FIG. 2) for visualizing alerts, navigation data, and tracker movement.
[0043] It must be noted that the visual alert serves as a critical safety feature. Any displacement beyond the pre-defined threshold could indicate unintended movement of the skin, trackers, or patient, potentially compromising the accuracy of the navigation. By alerting the surgical team in real time, the navigation system (100) enables timely corrective actions, such as tracker re-registration or re-alignment, thereby ensuring consistent surgical precision throughout the procedure.
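The threshold comparison described above can be sketched as follows. The tracker identifiers, displacement values, and the 2.0 mm threshold are hypothetical examples, since the disclosure derives the actual threshold from the magnetic resonance imaging report rather than a fixed constant.

```python
def check_displacement(displacements_mm, threshold_mm):
    """Return the ids of second trackers whose displacement relative to
    the first tracker exceeds the pre-defined threshold."""
    return [tid for tid, d in displacements_mm.items()
            if abs(d) > threshold_mm]

# Hypothetical per-tracker displacements (mm) relative to the first
# tracker; negative means the pair moved closer together.
alerts = check_displacement({"S1": 0.4, "S2": 2.7, "S3": -2.3},
                            threshold_mm=2.0)
# alerts == ["S2", "S3"]: these would trigger the visual alert on the
# display screen so the team can re-register or re-align.
```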
[0045] It may be noted that the foregoing system is an exemplary system and may be implemented as computer executable instructions in any computing or processing environment, including in digital electronic circuitry or in computer hardware, firmware, device driver, or software. As such, the system is not limited to any specific hardware or software configuration.
[0001] FIG. 2 illustrates a schematic diagram of a navigation system for tracking skin movement during surgery of FIG. 1, in accordance with an embodiment of the present disclosure. The functions of various elements shown in the figures, including any functional blocks labelled as "processor(s)" (210), may be provided through the use of dedicated hardware as well as hardware capable of executing instructions. The processor (210) may be the same processor (110) of FIG. 1. When provided by a processor (210), the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing instructions, and may implicitly comprise, without limitation, digital signal processor (DSP) hardware, a network processor, an application specific integrated circuit (ASIC), and a field programmable gate array (FPGA). Other hardware, standard and/or custom, may also be coupled to the processor(s).
[0002] The memory(s) (205) may be a computer-readable medium, examples of which comprise volatile memory (e.g., RAM) and/or non-volatile memory (e.g., erasable programmable read-only memory (EPROM), flash memory, etc.). The memory(s) (205) may be an external memory or an internal memory, such as a flash drive, a compact disk drive, an external hard disk drive, or the like. The navigation system (100) may further include the user interface that may allow the connection or coupling of the navigation system (100) with one or more other devices, through a wired (e.g., Local Area Network (LAN)) connection or through a wireless connection (e.g., Bluetooth®, Wi-Fi). The user interface (275) may also enable intercommunication between different logical as well as hardware components of the navigation system (100).
[0003] The navigation system (100) may include module(s). The module(s) may include a receiving module (215), an image recognition model (220), and a registration module (240). In one example, the module(s) may be implemented as a combination of hardware and firmware. In an example described herein, such combinations of hardware and firmware may be implemented in several different ways. For example, the firmware for module(s) may be processor (210) executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the module(s) may include a processing resource (for example, implemented as either single processor or combination of multiple processors), to execute such instructions. Further, the hardware for the module(s) may include communication apparatuses, control circuitries involving electrical and electronics components, sensors, and interface devices, which may be in communication with each other for multi-directional communication therebetween.
[0046] In the present examples, the non-transitory machine-readable storage medium may store instructions that, when executed by the processing resource, implement the functionalities of the module(s). In such examples, the navigation system (100) may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions. In other examples of the present subject matter, the machine-readable storage medium may be located at a different location but accessible to the navigation system (100) and the processor(s) (210).
[0047] Further, the navigation system (100) may include data. The data may include one or more video streams (255), magnetic resonance imaging report (260), and positional relationship (265). It may be noted here that data may include data that is either received, stored, or generated as a result of functions implemented by the navigation system. It may be further noted that the data may be utilized by the modules of the navigation system for performing various functions of the navigation system.
[0048] The navigation system (100) may be provided with a database (280) to store physical locations of the patient and one or more anatomical points. In an example implementation of the navigation system (100) including one or more servers, the databases may be local to the server or may be remote to the server. It may be noted that the data in the databases may be stored as a table or may be pre-stored as a mapping with one another. The present application is not limited thereto.
[0049] In operation, when the surgical procedure begins and the skin is retracted at the surgery region (160, FIG. 1), the surgical tracking navigation system (100) activates a set of hardware components including a plurality of trackers, the camera (135), and optionally, one or more surgical probes (140). Specifically, the plurality of second trackers (145, FIG. 1) are placed around the surgical site at a predetermined distance from the retracted skin, and the first tracker (155, FIG. 1) is positioned directly on a vertebra (150, FIG. 1) of the spine within the surgery region (160, FIG. 1). The plurality of trackers are configured to remain visible to the camera (135) during the procedure. The camera (135, FIG. 1) is adapted to capture one or more video streams of the surgical field in real-time, focusing on the spatial positions of both the first tracker (155) and the plurality of second trackers (145).
[0050] The visual data captured by the camera (135, FIG. 1), which may include real-time tracker displacement due to soft tissue movement or changes in anatomical orientation, is transmitted to a receiving module (215) configured within the surgical tracking navigation system (100). It must be noted that the one or more video streams may include information such as the relative position of the first tracker (155, FIG. 1) with respect to each of the plurality of second trackers (145, FIG. 1), changes in angular alignment, or any inadvertent movement of the surgical field.
[0051] In one embodiment, the receiving module (215) is configured to receive the one or more video streams from the camera (135, FIG. 1) in real-time. The one or more video streams are generated by the camera (135, FIG. 1) upon detection of the plurality of trackers affixed to the patient’s body, specifically at the surgical region (160, FIG. 1) of the spine. The processor (210) is in continuous communication with the camera (135, FIG. 1), either through a wired or wireless communication interface, and retrieves the visual data frame-by-frame without significant delay to enable immediate computational processing.
[0052] The real-time reception of video streams allows the navigation system (100) to maintain accurate spatial awareness of the trackers during the ongoing surgical procedure. The processor (210) continuously buffers and processes incoming image frames, which contain positional information of the plurality of trackers, to be used for navigation and displacement detection. The streaming is handled through an optimized data protocol to ensure minimal latency and high frame fidelity, which are crucial for high-precision surgical interventions.
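One minimal way to realise the frame buffering described above, keeping only the newest frames so processing latency stays bounded, is a fixed-size queue. This is a sketch under that assumption, not the disclosed implementation or protocol:

```python
from collections import deque

class FrameBuffer:
    """Fixed-size buffer for incoming video frames: the newest frames
    stay available for analysis while stale ones are dropped."""

    def __init__(self, max_frames=4):
        self._frames = deque(maxlen=max_frames)

    def push(self, frame):
        # deque(maxlen=...) silently evicts the oldest frame when full,
        # which bounds both memory use and processing latency.
        self._frames.append(frame)

    def latest(self):
        return self._frames[-1] if self._frames else None

    def __len__(self):
        return len(self._frames)

buf = FrameBuffer(max_frames=2)
for frame_id in range(5):   # frames 0..4 arrive from the camera
    buf.push(frame_id)
# Only the two newest frames (3 and 4) remain in the buffer.
```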
[0053] In one embodiment, the image recognition model (220) is configured to analyse the one or more video streams to track a delta change in distance between each of the plurality of the second trackers (145, FIG. 1) during the surgery in real time. The term “delta change” refers to a measurable shift in spatial relationship or relative positioning between any two or more second trackers affixed to the retracted skin at a defined distance from the primary surgical region. The plurality of second trackers (145, FIG. 1) are used as spatial reference markers, and the changes in distance among them reflect possible movement of skin or soft tissue during the surgical procedure.
[0054] The image recognition model (220) is a trained software algorithm that applies one or more computer vision techniques such as object detection, feature matching, geometric transformation mapping, and pattern recognition. The processor (210) applies the image recognition model (220) to identify and isolate the visual features of each of the plurality of second trackers (145, FIG. 1) in successive video frames. By comparing the geometric coordinates of the plurality of trackers across frames in real time, the image recognition model (220) computes the extent of movement (delta) between them.
[0055] An example of such image recognition analysis includes, but is not limited to, the use of a convolutional neural network (CNN) trained to detect the visual markers attached to the patient’s skin; a model based on scale-invariant feature transform (SIFT) for robust recognition of the plurality of tracker positions under varying lighting and angles; or an edge-based template matching model for identifying circular or QR-coded fiducial trackers in the video frames. The processor (210) may also employ frame-to-frame coordinate triangulation to maintain continuous mapping of the second trackers and to calculate instantaneous delta changes in millimetres.
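The frame-to-frame delta computation described above can be illustrated with a minimal sketch, assuming the image recognition model has already reduced each second tracker to a 2D centroid in millimetres. The tracker identifiers and coordinates below are hypothetical, chosen only to exercise the arithmetic.

```python
import math

def pairwise_distances(trackers):
    """Euclidean distance between every pair of tracker centroids.

    trackers: dict mapping tracker id -> (x, y) centroid in millimetres.
    Returns dict mapping (id_a, id_b) -> distance.
    """
    ids = sorted(trackers)
    return {
        (a, b): math.dist(trackers[a], trackers[b])
        for i, a in enumerate(ids)
        for b in ids[i + 1:]
    }

def delta_changes(baseline_frame, current_frame):
    """Delta change (mm) per tracker pair between two video frames."""
    base = pairwise_distances(baseline_frame)
    curr = pairwise_distances(current_frame)
    return {pair: curr[pair] - base[pair] for pair in base}

# Hypothetical centroids extracted by the image recognition model
frame0 = {"S1": (0.0, 0.0), "S2": (40.0, 0.0), "S3": (0.0, 30.0)}
frame1 = {"S1": (0.0, 0.0), "S2": (42.0, 0.0), "S3": (0.0, 30.0)}

deltas = delta_changes(frame0, frame1)
# the S1-S2 separation grew by 2.0 mm; the S2-S3 pair shifts slightly too,
# since S2 moved relative to both of its neighbours
```

Any pair whose delta exceeds a clinical tolerance would then be flagged as possible skin or soft-tissue movement.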
[0056] In one embodiment, the image recognition model (220) is configured to analyse the one or more video streams to track the delta change in distance between the plurality of second trackers (145, FIG. 1) and the first tracker (155, FIG. 1) during the surgery in real time. This analysis is facilitated by the image recognition model (220), which is programmed to detect, isolate, and map the unique visual features of each of the plurality of trackers captured in the video stream. As the surgery progresses, any mechanical displacement, whether due to involuntary tissue movement, patient repositioning, or surgical instrument manipulation, can alter the relative position between the first tracker (155, FIG. 1) and the plurality of second trackers (145, FIG. 1). The processor (210), through the model, identifies these changes by continuously analysing positional coordinates across sequential video frames.
[0057] An example of such an image recognition model (220) includes, but is not limited to, a deep learning model trained on annotated surgical datasets to detect fiducial markers placed on the patient’s spine and surrounding skin; a geometric feature-matching algorithm that maps relative distances using triangulated 2D or 3D positions from multiple video frames; or a visual simultaneous localization and mapping (vSLAM) model for tracking spatial orientation and movement of the trackers in real time.
[0058] It must be noted that detecting the delta change in distance between the first tracker (155, FIG. 1) and the plurality of second trackers (145, FIG. 1) enables the navigation system (100) to determine the accuracy of the first tracker’s (155, FIG. 1) position during surgery. Since the first tracker (155, FIG. 1) serves as a primary anatomical reference point, any shift in its location, either intentional or accidental, must be identified promptly to ensure navigational accuracy. The relative stability or movement of the plurality of second trackers (145, FIG. 1) serves as a baseline to assess whether the first tracker has moved. This functionality ensures real-time correction of instrument guidance and enhances surgical safety and reliability, especially in complex spine procedures where millimetric accuracy is essential.
[0059] In an embodiment, the registration module (240) is configured to perform a registration process for the patient by marking one or more anatomical landmarks on the magnetic resonance imaging report and matching one or more anatomical landmarks with corresponding physical locations on the patient, wherein the one or more anatomical landmarks comprises a point near the ears. The registration process involves the identification and alignment of one or more anatomical landmarks marked on a magnetic resonance imaging report with their corresponding physical locations on the patient’s body. This alignment is essential for establishing a spatial correlation between the preoperative imaging data and the intraoperative anatomy, thereby enabling accurate tracking and navigation during the surgical procedure.
[0060] The registration process begins by selecting key reference points visible both in the MRI scan and on the patient’s body. These anatomical landmarks are manually or semi-automatically marked on the MRI images and are then matched to the actual locations on the patient’s anatomy using a physical pointer, camera-based detection, or image-guided recognition system. In this context, the anatomical landmarks may serve as fiducial markers or reference points that define the coordinate transformation between the MRI model and the patient’s real-world positioning.
[0061] An example of an anatomical landmark includes, but is not limited to, a point near the ears (such as the external auditory meatus), the nasion, the inion, or the mastoid processes. The anatomical landmarks are typically chosen for their relative immobility and visibility in both imaging and physical examination, ensuring a reliable frame of reference during surgery. For spine-related procedures, additional landmarks may also include the spinous processes or iliac crests, depending on the surgical region.
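The landmark-matching step can be illustrated by a least-squares rigid alignment between the two point sets. The sketch below is a planar analogue of the usual 3D SVD-based alignment, with hypothetical landmark coordinates; it recovers the rotation and translation that map MRI-space landmarks onto their measured patient-space counterparts.

```python
import math

def register_landmarks_2d(mri_pts, patient_pts):
    """Least-squares rigid transform (rotation + translation) mapping
    MRI landmark coordinates onto patient-space coordinates.

    2D sketch of the registration step; a clinical system would solve
    the equivalent 3D problem (e.g. via SVD of the cross-covariance).
    """
    n = len(mri_pts)
    # Centroids of each point set
    cmx = sum(p[0] for p in mri_pts) / n
    cmy = sum(p[1] for p in mri_pts) / n
    cpx = sum(p[0] for p in patient_pts) / n
    cpy = sum(p[1] for p in patient_pts) / n
    # Accumulate cross-covariance terms of the centred point sets
    s_cos = s_sin = 0.0
    for (mx, my), (px, py) in zip(mri_pts, patient_pts):
        ax, ay = mx - cmx, my - cmy
        bx, by = px - cpx, py - cpy
        s_cos += ax * bx + ay * by
        s_sin += ax * by - ay * bx
    theta = math.atan2(s_sin, s_cos)       # optimal rotation angle
    c, s = math.cos(theta), math.sin(theta)
    tx = cpx - (c * cmx - s * cmy)         # translation after rotation
    ty = cpy - (s * cmx + c * cmy)

    def apply(pt):
        x, y = pt
        return (c * x - s * y + tx, s * x + c * y + ty)
    return apply

# Hypothetical landmarks (e.g. points near the ears and the nasion),
# where the patient set is the MRI set rotated 90 degrees and shifted
mri = [(0.0, 0.0), (10.0, 0.0), (0.0, 5.0)]
patient = [(2.0, 3.0), (2.0, 13.0), (-3.0, 3.0)]
to_patient = register_landmarks_2d(mri, patient)
```

Once `to_patient` is established, any point marked on the imaging report can be projected into the patient's physical coordinate frame for navigation.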
[0062] In an embodiment, the image recognition model (220) is configured to detect the delta change in position between the first tracker (155, FIG. 1) and the plurality of second trackers (145, FIG. 1) by comparing the one or more real-time video streams with previously stored positional relationships among the plurality of trackers. The image recognition model (220) is enabled to identify positional deviations. The stored positional relationships represent the baseline or reference configuration of the trackers, recorded either before surgery or at the beginning of the navigation procedure when the trackers are properly registered and calibrated.
[0063] An example of positional relationships includes, but is not limited to, Euclidean distances, angular orientations, and relative spatial vectors between the first tracker and each of the second trackers. The positional relationships may be stored in a look-up table or matrix format within the system’s memory and serve as a benchmark for assessing subsequent movement.
[0064] In an embodiment, the image recognition model (220) is configured to recalculate the displaced position between the first tracker (155, FIG. 1) and the plurality of second trackers (145, FIG. 1) and orientation for an accurate navigation. This recalculation is performed upon detection of a delta change in positional relationships that may arise due to skin retraction, tracker displacement, or patient movement during spine surgery.
[0065] The recalculation process involves determining the current spatial coordinates and orientation of the first tracker (155, FIG. 1) relative to the fixed positions of the plurality of second trackers (145, FIG. 1), using real-time video data received from the camera (135, FIG. 1). The image recognition model (220) retrieves baseline reference coordinates of the plurality of trackers, compares them with current coordinates derived from image recognition and positional estimation, and computes the new relative position and angular orientation required for realignment.
[0066] An example of displaced position and orientation recalculation includes, but is not limited to, calculating 3D coordinate shifts (X, Y, Z axes), angular deviations (pitch, yaw, roll), and transformation matrices that map the current configuration back to the original or reference configuration. These calculations ensure that any shifts in tracker position are compensated in the navigation interface and that the virtual overlay used for guidance reflects the real anatomical site accurately.
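A minimal sketch of this recalculation, restricted to the translational component (the angular pitch/yaw/roll handling mentioned above is omitted for brevity; the coordinates are hypothetical):

```python
import math

def displacement_correction(reference, current):
    """Compute the 3D shift of the first tracker and the homogeneous
    transformation matrix that maps the current position back onto the
    reference configuration.

    Pure-translation sketch; a full implementation would also estimate
    the rotational component (pitch, yaw, roll).
    """
    dx = reference[0] - current[0]
    dy = reference[1] - current[1]
    dz = reference[2] - current[2]
    magnitude = math.sqrt(dx * dx + dy * dy + dz * dz)
    # 4x4 homogeneous transform: applied to the current position,
    # it restores the reference position.
    T = [
        [1.0, 0.0, 0.0, dx],
        [0.0, 1.0, 0.0, dy],
        [0.0, 0.0, 1.0, dz],
        [0.0, 0.0, 0.0, 1.0],
    ]
    return magnitude, T

ref = (10.0, 20.0, 5.0)   # registered first-tracker position (mm)
cur = (11.0, 22.0, 5.0)   # position derived from the latest frame
shift_mm, T = displacement_correction(ref, cur)
# adding T's translation column (dx, dy, dz) to `cur` recovers `ref`
```

The resulting matrix is what lets the navigation interface keep the virtual overlay aligned with the real anatomical site after a detected shift.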
[0067] In an embodiment, the image recognition model (220) is configured to detect displacement of the first tracker (155, FIG. 1) based on comparison with the spatial relationships of the plurality of second trackers (145, FIG. 1), wherein the plurality of second trackers (145, FIG. 1) remain stationary and only the first tracker (155, FIG. 1) exhibits the delta change. The model detects this displacement by comparing the current spatial relationships among the plurality of trackers under the assumption that the plurality of second trackers (145, FIG. 1) remain fixed in position, so that any positional variation, referred to as a delta change, is attributable to the first tracker. This operation is particularly crucial for identifying and tracking minute changes in skin or anatomical structure displacement during spine surgery.
[0068] An example of displacement detection includes, but is not limited to, measuring a shift in the position vector of the first tracker beyond a pre-defined tolerance range, calculating deviations from a triangulated baseline formed by the plurality of second trackers (145, FIG. 1), and evaluating angular misalignment of the first tracker’s (155, FIG. 1) reference frame with respect to the frame defined by the plurality of second trackers (145, FIG. 1). This delta is used to assess how much the retracted skin or vertebra has shifted due to surgical manipulation or tissue elasticity.
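The two-step check described above — first validate that the second-tracker reference frame is stable, then measure the first tracker against it — can be sketched as follows. The tolerance value, dictionary layout, and coordinates are assumptions for illustration only.

```python
import math

TOLERANCE_MM = 2.0   # assumed pre-defined displacement threshold

def first_tracker_moved(baseline, current, tol=TOLERANCE_MM):
    """Flag first-tracker displacement only when the second trackers
    themselves have stayed put, so the reference frame is trustworthy.

    baseline/current: dicts {"first": (x, y, z), "second": [(x, y, z), ...]}
    Returns (moved_beyond_tolerance, shift_in_mm).
    """
    # Step 1: confirm the reference frame is stable.
    for b, c in zip(baseline["second"], current["second"]):
        if math.dist(b, c) > tol:
            raise RuntimeError("second tracker moved; reference frame invalid")
    # Step 2: measure first-tracker displacement against that frame.
    shift = math.dist(baseline["first"], current["first"])
    return shift > tol, shift

base = {"first": (0.0, 0.0, 0.0),
        "second": [(50.0, 0.0, 0.0), (0.0, 50.0, 0.0), (0.0, 0.0, 50.0)]}
now = {"first": (0.0, 3.4, 0.0),   # hypothetical 3.4 mm shift
       "second": base["second"]}   # second trackers unchanged
moved, shift = first_tracker_moved(base, now)
```

Raising an error when a second tracker moves is one possible policy; a clinical system might instead re-register the reference frame before continuing.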
[0069] Consider a non-limiting example wherein a surgeon, “Dr. Y,” begins a spinal surgery on patient “A” in a minimally invasive procedure requiring precision in navigation. Upon retraction of the skin at the surgical region, a plurality of second trackers (145, FIG. 1) are positioned circumferentially around the retracted skin at a predetermined distance, while a first tracker (155, FIG. 1) is securely attached to the exposed vertebra (150, FIG. 1) at the surgical region (160, FIG. 1). The camera (135, FIG. 1) is initialized and begins capturing real-time video streams of the surgical region, continuously detecting and monitoring the spatial relationship between the first tracker (155, FIG. 1) and the plurality of second trackers (145, FIG. 1). The navigation system (100) is loaded with the magnetic resonance imaging report of patient “A” showing a 3D representation of the spinal anatomy. As part of the registration process, the navigation system (100) highlights anatomical landmarks such as the mastoid processes near the ears, which are manually matched to corresponding physical landmarks on the patient.
[0070] The one or more video streams, containing visual data about the position and movement of the plurality of trackers, are received by the processor (210). The image recognition model (220) analyses the tracker positions to detect delta changes in distance between the trackers caused by subtle movements of the retracted skin or vertebrae during the surgery. When the delta change between the first tracker (155, FIG. 1) and any of the plurality of second trackers (145, FIG. 1) exceeds a predefined displacement threshold, which is dynamically referenced against the magnetic resonance imaging report, the navigation system (100) automatically recalculates the corrected orientation and location of the first tracker for precise navigation. A visual alert is displayed on the display screen (115, FIG. 1), notifying Dr. Y of the deviation, ensuring that surgical tools can be realigned accurately.
[0071] Further, while the surgery is ongoing, a live feed of the surgical region (160, FIG. 1) with superimposed graphical overlays showing displacement vectors and orientation arrows is displayed on the screen for the surgical team’s situational awareness. Dr. Y queries the system verbally, asking, “Has the vertebra shifted by more than 3 mm from baseline?” The processor interprets the one or more video streams, processes the latest image data, and responds with an on-screen message: “Yes, the first tracker has shifted by 3.4 mm posteriorly relative to second trackers.” In parallel, the navigation system (100) logs this event along with timestamp, tracker IDs, MRI reference data, and environmental notes (e.g., patient position, retractor tension) into the patient's intraoperative record for post-op analysis. Such intelligent interaction ensures surgical accuracy while maintaining a complete digital audit of procedural dynamics.
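The event-logging behaviour in this example can be sketched as a timestamped record appended to the intraoperative log. The field names below are illustrative assumptions, not the system's actual schema.

```python
import datetime
import json

def log_displacement_event(shift_mm, tracker_id, notes):
    """Serialize a timestamped displacement event for the patient's
    intraoperative record (sketch; field names are assumptions)."""
    event = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tracker_id": tracker_id,
        "shift_mm": shift_mm,
        "notes": notes,  # e.g. patient position, retractor tension
    }
    return json.dumps(event)

# Hypothetical event matching the 3.4 mm posterior shift above
record = log_displacement_event(
    3.4, "first", {"patient_position": "prone", "direction": "posterior"})
```

Each such record, keyed by timestamp and tracker ID, supports the post-operative audit of procedural dynamics described above.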
[0072] FIG. 3 is a flow chart representing the steps involved in a method for operating a navigation system for tracking skin movement during surgery in accordance with an embodiment of the present disclosure. The method (300) includes attaching, by a plurality of trackers, to a retracted skin at a surgery region of a spine during a surgery of a patient in step 305. This step is critical for establishing a reliable and spatially aware tracking configuration that can monitor any positional deviation caused due to dynamic factors during the surgery, such as skin retraction, surgical manipulation, or mechanical forces.
[0073] Each of the plurality of trackers is secured externally to the retracted skin with the help of biocompatible adhesive interfaces, surgical-grade fasteners, or other medically approved attachment mechanisms. The trackers are positioned in such a way that they collectively form a spatial reference frame around the surgical site. These trackers are individually distinguishable and are used to create a network of positional anchors that allow the system to calculate distances and displacements in three-dimensional space during the surgery.
[0074] The method (300) includes positioning, by a first tracker, at the surgery region on a vertebra of the spine in step 310. This first tracker serves as the primary or reference tracker, forming a central anchor for navigation-based computations during the surgical procedure. The placement of the first tracker on the vertebra ensures direct alignment with the bony anatomy, thus minimizing relative motion and enhancing spatial accuracy when registering anatomical structures during the surgery.
[0075] The first tracker is securely affixed to the vertebral bone using a medically approved fixation mechanism, such as a bone screw, a clamp, or a pin. This tracker is designed to withstand motion artifacts and is chosen for its rigid fixation capability to ensure that it moves consistently with the spine and not with the surrounding soft tissues. The secure placement of the first tracker allows the system to correlate intraoperative tracker movement with pre-operative imaging data, such as MRI or CT scans, thus enabling precise navigation.
[0076] The method (300) includes positioning, by a plurality of second trackers, around the spine at a predetermined distance from the retracted skin at the surgery region in step 315. The plurality of second trackers are not attached directly to bone or internal anatomical structures but rather are secured to stable regions in proximity to the retracted skin. The predetermined distance is selected based on clinical requirements and surgical planning parameters to ensure these second trackers remain stationary throughout the surgical procedure, unaffected by soft tissue displacement, tool interference, or changes in patient posture.
[0077] The method (300) includes capturing, by a camera, one or more video streams during the surgery to detect the plurality of trackers pertaining to the patient in step 320. The camera is positioned in such a way that it has an unobstructed line of sight to all the trackers including the first tracker affixed to the vertebra at the surgery region and the plurality of second trackers placed around the spine. The camera operates in real-time and captures continuous visual data throughout the surgical procedure, ensuring that any movement of the trackers due to skin retraction or other surgical manipulations is accurately recorded.
[0078] The method (300) includes receiving, by a processor, the one or more video streams from the camera in real-time in step 325. Once the camera captures the live video streams reflecting the spatial positioning and potential movement of the plurality of trackers affixed to the patient’s surgical region, these video streams are transmitted to the processor without delay for further computational processing.
[0079] The method (300) includes analysing, by the processor, the one or more video streams using an image recognition model to track a delta change in distance between each of the plurality of the second trackers during the surgery in real time in step 330. The image recognition model comprises a trained algorithm designed to detect, distinguish, and measure the positions of each individual second tracker relative to the others at various time intervals during the surgical procedure.
[0080] The method (300) includes analysing, by the processor, the one or more video streams using the image recognition model to track a delta change in distance between the plurality of the second trackers and the first tracker during the surgery in real time in step 335. The first tracker, which is fixed to a vertebra at the surgical site, may experience displacement due to retraction or manipulation of the skin or due to direct surgical interference. By evaluating the first tracker in reference to the more stable plurality of second trackers, the system ensures continuous awareness of the primary reference frames.
[0081] The method (300) includes displaying, by a display screen, the one or more video streams pertaining to the delta change in distance between the first tracker and the plurality of second trackers, wherein the one or more video streams comprises a displacement between the first tracker and the plurality of second trackers based on the retracted skin movement, wherein the displacement of the first tracker is displayed with respect to the plurality of second trackers in step 340. The one or more video streams are processed in real time and overlaid with computed displacement values derived from the movement of the retracted skin and the vertebral landmark to which the first tracker is affixed. The display allows a surgeon to visually monitor and assess whether any unexpected displacement of the first tracker has occurred due to surgical manipulation or anatomical shift.
[0082] The displacement, as depicted on the display, represents a spatial deviation of the first tracker relative to the stable frame of reference formed by the plurality of second trackers. By comparing these positions, the display effectively communicates whether the surgical site has shifted from its calibrated position, enabling corrective measures to be taken promptly.
[0083] Various embodiments of the navigation system for tracking skin movement during surgery and the method thereof as described above provide numerous advantages. One primary advantage is the use of the first tracker positioned directly on the vertebra at the surgical site, enabling precise real-time localization of the operative anatomy. The plurality of second trackers, positioned around the spine at the predetermined distance from the retracted skin, act as stable reference points, allowing for consistent and accurate detection of spatial displacement. The integration of a camera to capture real-time video streams provides a non-intrusive and continuous method to monitor the positions of all trackers during surgery. Furthermore, the incorporation of a navigation system that receives a magnetic resonance imaging (MRI) report allows for advanced pre-operative planning and accurate anatomical referencing through landmark registration. The display screen, which visualizes the displacement of the first tracker relative to the second trackers, provides immediate feedback to the surgical team, enhancing intraoperative awareness and reducing the risk of navigation errors. Additionally, the cart supporting the navigation system ensures the system is mobile and conveniently accessible within the operating room. Finally, the use of one or more surgical probes facilitates guided navigation of instruments to the intended anatomical location with high confidence, and the system’s ability to issue visual alerts upon detecting displacement beyond a predefined threshold ensures safety and responsiveness to anatomical shifts caused by soft tissue movement or surgical manipulation. Collectively, these components work in harmony to deliver a robust, intelligent, and adaptable surgical navigation solution.
[0084] The techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing subsystem” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
[0085] Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure. In addition, any of the described units, modules, or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
[0086] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof.
[0087] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0088] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts need to be necessarily performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.

Claims:

WE CLAIM:

1. A navigation system (100) for tracking skin movement during surgery, comprising:
a plurality of trackers adapted to be attached to a retracted skin at a surgery region (160) of a spine during a surgery of a patient, wherein the plurality of trackers comprises:
a first tracker (155) positioned at the surgery region (160) on a vertebra (150) of the spine; and
a plurality of second trackers (145) positioned around the spine at a predetermined distance from the retracted skin at the surgery region (160);
a camera (135) adapted to capture one or more video streams during the surgery to detect the plurality of trackers;
a processor (110);
a memory (120) coupled to the processor (110), wherein the memory (120) comprises instructions that, when executed by the processor (110), cause the processor (110) to:
receive the one or more video streams from the camera (135) in real-time;
analyse the one or more video streams using an image recognition model (220) to track a delta change in distance between each of the plurality of the second trackers (145) during the surgery in real time;
analyse the one or more video streams using the image recognition model (220) to track the delta change in distance between the plurality of second trackers (145) and the first tracker (155) during the surgery in real time;
display the one or more video streams pertaining to the delta change in distance between the first tracker (155) and the plurality of second trackers (145),
wherein the one or more video streams comprises a displacement between the first tracker (155) and the plurality of second trackers (145) based on the retracted skin movement, and
wherein the displacement of the first tracker (155) is displayed with respect to the plurality of second trackers (145).

2. The navigation system (100) as claimed in claim 1, wherein the memory (120) further comprises instructions to cause the processor (110) to:
receive a magnetic resonance imaging report representing a three-dimensional anatomical structure of the spine as an input.

3. The navigation system (100) as claimed in claim 1, wherein the memory (120) further comprises instructions to cause the processor (110) to:
perform a registration process for the patient by marking one or more anatomical landmarks on the magnetic resonance imaging report and matching one or more anatomical landmarks with corresponding physical locations on the patient, wherein the one or more anatomical landmarks comprises a point near the ears.

4. The navigation system (100) as claimed in claim 1, further comprising:
a cart adapted to support the navigation system (100);
one or more surgical probes (140) adapted to assist in guiding a surgical instrument at a desired anatomical location of the spine during the surgery; and
a display screen (115) adapted to display a visual alert upon detecting displacement between the first tracker (155) and the plurality of second trackers (145) beyond a pre-defined displacement threshold, wherein the pre-defined displacement threshold is based on the magnetic resonance imaging report.

5. The navigation system (100) as claimed in claim 1, wherein the memory (120) further comprises instructions to cause the processor (110) to:
detect the delta change in position between the first tracker (155) and the plurality of second trackers (145) by comparing the one or more real-time video streams with previously stored positional relationships among the plurality of trackers, and
recalculate the displaced position between the first tracker (155) and the plurality of second trackers (145) and orientation for an accurate navigation.

6. The navigation system (100) as claimed in claim 1, wherein the memory (120) further comprises instructions to cause the processor (110) to:
detect displacement of the first tracker based on comparison with the spatial relationships of the plurality of second trackers (145), wherein the plurality of second trackers (145) remains stationary and only the first tracker (155) exhibits the delta change.

7. A method (300) for operating a navigation system for tracking skin movement during a spine surgery, comprising:
attaching, by a plurality of trackers, to a retracted skin at a surgery region of a spine during a surgery of a patient; (305)
positioning, by a first tracker, at the surgery region on a vertebra of the spine; (310)
positioning, by a plurality of second trackers, around the spine at a predetermined distance from the retracted skin at the surgery region; (315)
capturing, by a camera, one or more video streams during the surgery to detect the plurality of trackers pertaining to the patient; (320)
receiving, by a processor, the one or more video streams from the camera in real-time; (325)
analysing, by the processor, the one or more video streams using an image recognition model to track a delta change in distance between each of the plurality of the second trackers during the surgery in real time; (330)
analysing, by the processor, the one or more video streams using the image recognition model to track the delta change in distance between the plurality of second trackers and the first tracker during the surgery in real time; (335)
displaying, by a display screen, the one or more video streams pertaining to the delta change in distance between the first tracker and the plurality of second trackers, wherein the one or more video streams comprises a displacement between the first tracker and the plurality of second trackers based on the retracted skin movement, wherein the displacement of the first tracker is displayed with respect to the plurality of second trackers. (340)

Dated this 03rd Day of October 2025
Signature

Manish Kumar
Patent Agent (IN/PA-5059)
Agent for the Applicant
