Abstract: A system (100) for performing ultrasound imaging. The system includes a robotic unit (102) and a remote-control station (104). The robotic unit includes a platform (608) and a column (606) disposed on the platform, adapted to slide along a length of the platform. A cantilever beam (604) extends from a side of the column and is adapted to move along a length of the column and relative to the platform. The robotic unit includes a three-link manipulator (602). The three-link manipulator (602) is configured to allow movement of a robotic ultrasound probe (618) along the body surface of a subject. The remote-control station (104) includes a handheld probe unit (116) and a curved haptic pad (118). The remote-control station is configured to enable a user to control one or more operations of the robotic unit from a remote location.
FIELD OF TECHNOLOGY
[001] The present disclosure generally relates to the field of medical instruments and more particularly to a system for performing ultrasound imaging.
BACKGROUND
[002] Ultrasound imaging is widely utilized and plays a crucial role in the early detection and monitoring of medical conditions. Ultrasound machines provide real-time imaging of internal structures, including soft tissues, the heart, liver, and other organs, as well as fetal development during pregnancy. Their portability and relatively low cost make ultrasound machines particularly suitable for point-of-care use and monitoring.
[003] Ultrasound imaging is free from ionizing radiation, making ultrasound imaging a safer alternative compared to other imaging methods. This modality is particularly critical in pregnancy, aiding in fetal health assessment, detecting abnormalities, and supporting maternal care. Additionally, ultrasound imaging plays a key role in emergencies by enabling rapid detection of life-threatening conditions such as internal bleeding or abdominal abnormalities.
[004] Conventional ultrasound systems rely on mechanical configurations involving motors for translational, tilting, rocking, and rotational motions. However, conventional ultrasound systems face significant limitations in ultrasound probe maneuverability. For example, conventional ultrasound systems cannot accommodate ultrasound probe positioning below the bed plane or at sharp angles at the bed edges, thereby restricting the ability of the ultrasound probe to produce or radiate ultrasound waves to an object or area of interest. Systems with soft end-effectors also fail to provide the freedom needed for precise probe manipulation, limiting tilting and rocking angles to less than 90 degrees.
[005] Existing remote ultrasound solutions, such as the MGIUS-R3 and the Melody AdEchoTech, partially address these challenges but introduce limitations of their own. While the MGIUS-R3 robotic ultrasound machine enables remote control of the ultrasound probe, the MGIUS-R3 robotic ultrasound machine cannot cover all critical areas of a subject’s body, particularly the abdomen. Here, the term ‘subject’ indicates an individual or patient undergoing ultrasound imaging or diagnostic procedures. Similarly, the Melody AdEchoTech robotic ultrasound machine allows roll, pitch, and yaw movements and offers remote application of pressure on the subject’s abdomen.
[006] However, the Melody AdEchoTech robotic ultrasound machine still requires skilled personnel at the subject’s location to position the ultrasound probe initially, reducing independence and scalability for widespread use. The dependence on skilled personnel physically present with the subject limits the accessibility of ultrasound imaging, especially in emergencies or in remote locations. This necessitates that the subject travel to diagnostic centers, further delaying medical intervention and increasing the risk of mortality in critical cases. The lack of autonomous solutions capable of fully remote operation not only hinders timely diagnostics but also underscores the urgent need for innovations that bridge this gap in healthcare delivery.
SUMMARY
[007] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention nor is it intended for determining the scope of the invention.
[008] According to an embodiment of the present disclosure, a system for performing ultrasound imaging is provided. The system includes a robotic unit. The robotic unit includes a platform and a column disposed on the platform, adapted to slide along a length of the platform. Further, the robotic unit includes a cantilever beam extending from a side of the column and adapted to move along a length of the column and relative to the platform. Furthermore, the robotic unit includes a three-link manipulator slidably mounted to a distal end of the cantilever beam via a support member. The three-link manipulator includes a first link having a proximal end rotatably coupled to the support member and a distal end formed at a predefined angle with respect to the proximal end of the first link. Further, the three-link manipulator includes a second link having a proximal end rotatably coupled to the distal end of the first link and a distal end formed at the predefined angle with respect to the proximal end of the second link. In addition, the three-link manipulator includes a third link having a proximal end rotatably coupled to the distal end of the second link and a distal end opposite to the proximal end of the third link and having a mount configured to allow movement of a robotic ultrasound probe along the body surface of a subject. The third link comprises at least one force sensor configured to sense an amount of force being applied to the body surface of the subject and an excess force absorber configured to absorb and dissipate force exceeding a predefined threshold value. The system includes a remote-control station configured to enable a user to control one or more operations of the robotic unit from a remote location. The remote-control station includes a handheld probe unit. The handheld probe unit includes one or more sensors to detect one or more spatial parameters based on hand movement of the user.
The handheld probe unit is configured to transmit the one or more spatial parameters to the robotic unit for controlling movement of the robotic ultrasound probe along the body surface of the subject. The remote-control station includes a curved haptic pad positioned in conjunction with the handheld probe unit. The curved haptic pad is configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad and to provide pressure to the user based on one or more feedback signals indicating physical interactions experienced by the robotic ultrasound probe on the body surface of the subject. The curved haptic pad is configured to receive the one or more feedback signals from the robotic unit via a computing unit.
[009] According to another embodiment of the present disclosure, a handheld probe unit coupled to a system for performing ultrasound imaging is provided. The handheld probe unit includes at least one sensor adapted to detect one or more spatial parameters associated with a hand movement of a user holding the probe unit. Further, the handheld probe unit includes a communication unit adapted to transmit the one or more spatial parameters to a robotic unit for controlling movement of a robotic ultrasound probe along the body surface of a subject. A curved haptic pad is positioned in conjunction with the handheld probe unit. The curved haptic pad is configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad and to provide pressure to the user based on one or more feedback signals indicating physical interactions experienced by the robotic ultrasound probe on the body surface of the subject. The curved haptic pad is configured to receive the one or more feedback signals from the robotic unit.
[0010] According to yet another embodiment of the present disclosure, a three-link manipulator operably coupled to a system for performing ultrasound imaging is provided. The three-link manipulator is slidably mounted to a distal end of a cantilever beam via a support member. The three-link manipulator includes a first link having a proximal end rotatably coupled to the support member and a distal end formed at a predefined angle with respect to the proximal end of the first link. The three-link manipulator includes a second link having a proximal end rotatably coupled to the distal end of the first link and a distal end formed at the predefined angle with respect to the proximal end of the second link. The three-link manipulator includes a third link having a proximal end rotatably coupled to the distal end of the second link and a distal end opposite to the proximal end of the third link and having a mount configured to allow movement of a robotic ultrasound probe along the body surface of a subject. The third link includes at least one force sensor configured to sense an amount of force being applied to the body surface of the subject and an excess force absorber configured to absorb and dissipate force exceeding a predefined threshold value.
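The force-limiting behaviour of the third link described above may be sketched, purely for illustration, as follows. The class name, the threshold value, and the split between applied and absorbed force are assumptions made for this sketch; the disclosure does not specify any particular implementation of the force sensor or the excess force absorber.

```python
# Illustrative sketch only: the class name, threshold value, and the
# applied/absorbed split are assumptions, not part of the disclosure.

class ForceLimiter:
    """Monitors probe contact force and routes excess force to an absorber."""

    def __init__(self, threshold_newtons: float):
        self.threshold = threshold_newtons

    def check(self, sensed_force: float) -> dict:
        """Split the sensed force into an applied part and an absorbed part.

        Force up to the predefined threshold is applied to the subject;
        anything beyond the threshold is routed to the excess force absorber.
        """
        absorbed = max(0.0, sensed_force - self.threshold)
        applied = sensed_force - absorbed
        return {
            "applied": applied,
            "absorbed": absorbed,
            "over_limit": absorbed > 0.0,
        }


limiter = ForceLimiter(threshold_newtons=15.0)
print(limiter.check(12.0))  # within the limit: nothing absorbed
print(limiter.check(20.0))  # excess force routed to the absorber
```

In a real system, the `check` result would feed back into the manipulator's motion controller; this sketch only captures the threshold logic itself.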
[0011] Advantages of the present invention:
The above-mentioned system of the present disclosure provides the following advantages:
1. Masking of the genital organs of the fetus using artificial intelligence, thereby restricting the chance of pre-natal sex determination.
2. The robotic arm and ultrasound system at the subject’s location have GPS installed, through which all the machines can be tracked, ensuring stronger implementation of the PC-PNDT Act.
3. The present invention includes the system that enables remote ultrasound imaging, reducing the need for skilled personnel to be physically present at the subject’s location.
4. The present invention provides real-time, high-resolution imaging of internal structures, ensuring accurate diagnostics even in remote or underserved regions.
5. The present invention incorporates advanced robotics for precise probe positioning, overcoming limitations of traditional systems in probe manoeuvrability.
6. The present invention utilizes an excess force absorber to prevent damage to the subject and the system during operation, ensuring safety and reliability.
7. The present invention supports multiple degrees of freedom, allowing the robotic ultrasonic probe to access complex anatomical surfaces, including areas below the bed plane and sharp angles.
8. The present invention provides features of autonomous calibration and failure position configuration, reducing dependency on manual intervention and enhancing system robustness.
9. The present invention integrates blockchain technology for secure storage and management of medical records, ensuring data integrity and accessibility.
10. The present invention offers a modular design for the three-link manipulator, providing adaptability and flexibility in diverse diagnostic scenarios.
11. The present invention includes haptic feedback mechanisms for enhanced operator control, replicating tactile sensations remotely.
12. The present invention includes proximity and pressure sensors to optimize probe movement and ensure patient comfort during procedures.
13. The present invention reduces the need for patient travel to diagnostic centres, facilitating timely medical intervention and improving healthcare outcomes.
14. The present invention incorporates energy-efficient components, making it suitable for use in areas with inconsistent power supply.
15. The present invention addresses maternal and infant mortality rates by enabling accessible diagnostic solutions for critical health conditions.
16. The present invention provides compatibility with environmental analyzers and audio/video feeds, creating a comprehensive diagnostic ecosystem.
17. The present invention facilitates scalability and independence in healthcare delivery, supporting the widespread adoption of ultrasound technology.
18. The present invention includes the three-link manipulator capable of translational motion along a specially designed cantilever beam. The beam is structured to ensure patients do not feel claustrophobic while providing sufficient span to cover the entire sagittal axis of the patient. This cantilever beam maximizes both patient comfort and operational efficiency.
19. The present invention includes an emergency switch that allows immediate manual intervention to halt all operations in critical situations.
20. In the event of power or internet disconnection, the robotic unit automatically returns to a safe, predefined position, preventing damage to the patient or the system.
21. The present invention ensures effective remote collaboration between the user and the subject.
[0012] To further clarify the advantages and features of the present subject matter, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail in the accompanying drawings.
BRIEF DESCRIPTION OF THE FIGURES
[0013] These and other features, aspects, and advantages of the exemplary embodiments can be better understood when the following detailed description is read with reference to the accompanying drawings, in which like characters represent like parts throughout the drawings, wherein:
[0014] Figure 1 illustrates a block diagram depicting a system for performing ultrasound imaging, in accordance with an embodiment of the present disclosure;
[0015] Figure 2 illustrates a block diagram of a computing unit for performing ultrasound imaging, in accordance with an embodiment of the present disclosure;
[0016] Figure 3 illustrates a schematic representation of a remote-control station, in accordance with an embodiment of the present disclosure;
[0017] Figure 4 illustrates a schematic representation of a handheld probe unit adapted to couple to the system for performing the ultrasound imaging, in accordance with an embodiment of the present disclosure;
[0018] Figure 5 illustrates a schematic representation of a curved haptic pad, in accordance with an embodiment of the present disclosure;
[0019] Figure 6A shows a side view of a robotic unit, in accordance with an embodiment of the present disclosure;
[0020] Figure 6B shows a front view of the robotic unit, in accordance with an embodiment of the present disclosure;
[0021] Figure 7 illustrates a schematic representation of a three-link manipulator, in accordance with an embodiment of the present disclosure;
[0022] Figure 8 illustrates a front view of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0023] Figure 9A illustrates an isometric view of a first link of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0024] Figure 9B illustrates a trimetric view of the first link of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0025] Figure 10A illustrates an isometric view of a second link of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0026] Figure 10B illustrates a trimetric view of the second link of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0027] Figure 11 illustrates a third link of the robotic unit with the three-link manipulator, in accordance with an embodiment of the present disclosure;
[0028] Figure 12 illustrates a communication framework between the computing unit and the handheld probe unit, according to an embodiment of the present disclosure; and
[0029] Figure 13 illustrates a subject’s computing unit located at the remote unit, according to an embodiment of the present disclosure.
[0030] Further, skilled artisans will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0031] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the figures and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
[0032] It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
[0033] The terms "comprise", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion such that a process or method that comprises a list of steps does not comprise only those steps but may comprise other steps not expressly listed or inherent to such a process or a method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components. Appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0034] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
[0035] In addition to the illustrative aspects, exemplary embodiments, and features described above, further aspects, exemplary embodiments of the present disclosure will become apparent by reference to the drawings and the following detailed description.
[0036] In some embodiments, the words ‘doctor’, ‘expert’, ‘healthcare provider’, and ‘radiologist’ used in the description may reflect the same meaning and may be used interchangeably, referring to the person who conducts the scanning of the subject’s body from the remote location. Embodiments of the present disclosure will be described below in detail with reference to the accompanying figures.
[0037] The system as disclosed is an ultrasound system with a pool of radiologists to serve subjects at remote locations, with centers and mobile setups, ultimately reducing the gaps in healthcare delivery.
[0038] The system as disclosed herein helps in performing ultrasound imaging procedures even from remote locations, where a radiologist can manipulate and control all the movements of a probe, such as rocking, tilting, and rotation, with the help of a mock probe at the site of the doctor. In this disclosure, the term ‘remote location’ is used, for the sake of brevity, to mean that the healthcare provider, for example, a radiologist, is not in the immediate vicinity of the subject. For example, in cases of contagious diseases, the subject could be in one room and the radiologist in the next room. In another example, the radiologist may be in one city and the subject in another city or a village, for example, where the disclosed robotic equipment is located.
[0039] This system is integrated with wireless connectivity, ensuring the seamless transmission of data between the doctor’s site and the subject site.
[0040] The present system includes modules such as biometric authentication to access the device, a global positioning system to track the device, data encryption to avoid any data leakage, haptic feedback on the touch and the pressure of the ultrasound probe on the subject, limit on the amount of pressure that can be applied on the subject ensuring the safety of the subject, avoiding any contact of the robotic arm with the subject apart from the probe, video conferencing system for communication between the doctor and the subject, masking of sex organs to prevent any sex determination and other measures to ensure the safety and better imaging for diagnosis.
[0041] This artificial intelligence-based system masks the genital organs of the fetus in the displayed image, restricting the chance of pre-natal sex determination during scanning. Each of the disclosed systems is tracked through the Global Positioning System (GPS) to confirm the implementation of the Pre-Conception and Pre-Natal Diagnostic Techniques (PC-PNDT) Act.
[0042] Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
[0043] Figure 1 illustrates a block diagram 100 depicting a system to perform ultrasound imaging, in accordance with an embodiment of the present disclosure. System 100 aims to address the challenges associated with conventional ultrasound imaging by introducing innovative solutions for remote operability and improved accessibility. The objective of the present disclosure is to enable precise and autonomous ultrasound imaging that overcomes limitations in probe maneuverability, reduces dependency on skilled personnel, and ensures effective diagnostics even in remote or underserved regions. System 100 facilitates real-time, high-quality imaging for diverse clinical applications, thereby bridging critical gaps in healthcare delivery and improving patient outcomes.
[0044] System 100 may include a robotic unit 102 and a remote-control station 104. The robotic unit 102 may communicate with the remote-control station 104 over a network 108. The network 108 may include a communication infrastructure enabling data exchange between the robotic unit 102 and the remote-control station 104. The network 108 may include, but is not limited to, wired networks (such as Ethernet or fiber-optic connections, providing high-speed and reliable communication channels), wireless networks (such as Wi-Fi, cellular networks (e.g., 4G, 5G), or satellite networks, allowing flexible and remote connectivity), Local Area Networks (LAN), Wide Area Networks (WAN), private or dedicated networks, and the like. The remote-control station 104 may be a user-operated interface system configured to control and monitor the robotic unit 102 performing ultrasound imaging. For example, the remote-control station 104 may enable a radiologist or clinician to perform ultrasound imaging while maintaining precise control and receiving detailed feedback for accurate diagnostics.
[0045] The robotic unit 102 may be stationed at a subject’s location. The subject’s location may include, but is not limited to, a remote location, rural healthcare facilities, mobile diagnostic units, clinics where trained professionals may not be readily available, and the like.
[0046] The robotic unit 102 may be configured to perform actual ultrasound scanning by manipulating an ultrasound probe (not shown in Figure 1) across a body surface of the subject. The term ‘subject’ indicates an individual or patient undergoing ultrasound imaging or diagnostic procedures. The robotic unit 102 may operate under control of commands received from the remote-control station 104, operated by a user. The user may include, but is not limited to, a radiologist, a doctor, a clinician, and the like. The robotic unit 102 may include a geo-tagging module 112 and a blockchain module 114. The geo-tagging module 112 may be configured to detect and record the location of the robotic unit 102. The geo-tagging module 112 may be a hardware and/or software component integrated into the robotic unit 102.
[0047] The geo-tagging module 112 may be configured to detect the real-time geographic location of the robotic unit 102 using satellite navigation systems (for example, the Global Positioning System (GPS), the Global Navigation Satellite System (GLONASS), and the like). For example, if the robotic unit 102 is moved outside the authorized location, the robotic unit 102 locks its functionality to prevent unauthorized use. Unauthorized location changes may trigger immediate notifications to emergency contacts. The emergency contacts may include, but are not limited to, the clinician overseeing the robotic unit 102, district health authorities, hospital administrators, other relevant personnel, and the like. The geo-tagging module 112 may be configured to log location changes and unauthorized access attempts for future review.
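The lock-out behaviour of the geo-tagging module 112 described above may be sketched as follows, purely for illustration. The authorized coordinates, the geofence radius, and the log format are assumptions made for this sketch; the disclosure only requires that unauthorized location changes lock the unit, trigger notifications, and be logged.

```python
# Illustrative sketch only: the home coordinates, geofence radius, and
# log format are assumptions, not specified by the disclosure.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))


class GeoTagModule:
    """Locks the unit and logs an event when it leaves the authorized zone."""

    def __init__(self, home_lat, home_lon, radius_km=1.0):
        self.home = (home_lat, home_lon)
        self.radius_km = radius_km
        self.locked = False
        self.log = []

    def update_position(self, lat, lon):
        """Record a new GPS fix; lock functionality if outside the geofence."""
        dist = haversine_km(self.home[0], self.home[1], lat, lon)
        if dist > self.radius_km:
            self.locked = True
            self.log.append(("unauthorized_move", lat, lon, round(dist, 3)))
        return self.locked
```

A real module would additionally dispatch the notifications to the emergency contacts; this sketch captures only the geofence check and the lock/log behaviour.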
[0048] The blockchain module 114 may be a software component integrated with the robotic unit 102, enabling secure and decentralized data handling. The blockchain module 114 may be configured to interface with the robotic unit 102 to store, manage, and share critical data such as medical records and tele-diagnostic results collected or processed by the robotic unit 102 during one or more operations.
[0049] The remote-control station 104 may be configured to enable the user to control the one or more operations of the robotic unit 102 from the remote location using a computing unit 106. The computing unit 106 may include, but is not limited to, a laptop, a desktop computer, an embedded system, a mini personal computer, a server, a tablet or mobile device, an edge computing device, a dedicated medical device controller, and the like. The one or more operations may include, but are not limited to, ultrasound probe movement, pressure adjustment, probe positioning and orientation, image acquisition and scanning parameters, feedback reception and response, emergency overrides, calibration and customization, and the like.
[0050] The remote-control station 104 may include a control panel 110 configured to adjust one or more settings of a robotic ultrasound probe. The one or more settings may include depth, frequency, imaging modes, and the like. The depth may refer to the penetration level of ultrasound waves, adjusted to focus on specific layers of tissues or organs at varying distances. The frequency may determine the resolution and penetration of ultrasound waves; higher frequencies provide better resolution but lower penetration, while lower frequencies penetrate deeper with reduced resolution. The imaging mode may include various visualization techniques such as B-mode (brightness mode for 2D imaging), Doppler (for blood flow analysis), and M-mode (motion mode for dynamic assessments). The robotic ultrasound probe may be a specialized component of the robotic unit 102 configured to conduct the ultrasound imaging in a precise, automated manner. The robotic ultrasound probe may be configured to integrate advanced robotic and imaging technologies to provide real-time diagnostic capabilities, particularly in telemedicine and remote healthcare scenarios. Further, the remote-control station 104 may include a handheld probe unit 116 and a curved haptic pad 118.
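The one or more settings adjusted through the control panel 110 may be modelled, for illustration, as a validated settings object. The numeric ranges and mode names used here are assumptions for the sketch; actual probes define their own limits.

```python
# Illustrative sketch only: the value ranges and mode names are
# assumptions for the sketch, not specified by the disclosure.
from dataclasses import dataclass

IMAGING_MODES = {"B-mode", "Doppler", "M-mode"}


@dataclass
class ProbeSettings:
    depth_cm: float       # penetration depth of the ultrasound beam
    frequency_mhz: float  # higher frequency: better resolution, less depth
    mode: str             # visualization technique

    def validate(self) -> bool:
        """Reject settings outside the (assumed) operating envelope."""
        if not 1.0 <= self.depth_cm <= 30.0:
            raise ValueError("depth out of range")
        if not 1.0 <= self.frequency_mhz <= 18.0:
            raise ValueError("frequency out of range")
        if self.mode not in IMAGING_MODES:
            raise ValueError("unknown imaging mode")
        return True


settings = ProbeSettings(depth_cm=10.0, frequency_mhz=5.0, mode="B-mode")
settings.validate()
```

Validating at the control panel, before any command reaches the robotic unit 102, is one way to keep out-of-range requests from ever reaching the hardware.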
[0051] The handheld probe unit 116 may include sensors 120 configured to detect one or more spatial parameters based on the hand movement of the user. Handheld probe unit 116 may be configured to transmit one or more spatial parameters to the robotic unit 102 for controlling movement of the robotic ultrasound probe along the body surface of the subject. The curved haptic pad 118 may be positioned in conjunction with the handheld probe unit 116. The curved haptic pad 118 may be configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad 118 and provide pressure to the user based on feedback signals. The feedback signals may indicate physical interactions experienced by the robotic ultrasound probe on the body surface of the subject. The curved haptic pad 118 may be configured to receive feedback signals from the robotic unit 102 through the computing unit 106.
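The interaction among the handheld probe unit 116, the robotic unit 102, and the curved haptic pad 118 described above may be sketched as a pair of mapping functions. The scaling factor, the per-step clamp, and the force normalization are illustrative assumptions; the disclosure does not prescribe a particular mapping.

```python
# Illustrative sketch only: the scale factor, step clamp, and feedback
# normalization are assumptions, not specified by the disclosure.

def map_hand_to_probe(spatial, scale=0.5, max_step_mm=5.0):
    """Scale and clamp hand displacements before sending them to the robot.

    `spatial` holds the detected spatial parameters: translations in mm
    (dx, dy, dz) and orientation angles in degrees (roll, pitch, yaw).
    """
    command = {}
    for axis in ("dx", "dy", "dz"):
        step = spatial.get(axis, 0.0) * scale
        command[axis] = max(-max_step_mm, min(max_step_mm, step))
    for angle in ("roll", "pitch", "yaw"):
        command[angle] = spatial.get(angle, 0.0)  # orientation passed through
    return command


def haptic_feedback(sensed_force, max_force=20.0):
    """Convert the sensed contact force into a 0..1 pad actuation level."""
    return min(1.0, max(0.0, sensed_force / max_force))
```

Clamping each translational step bounds how far a single hand motion can move the probe, which complements the force limiting performed at the third link.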
[0052] Figure 2 illustrates a block diagram of computing unit 106 for performing ultrasound imaging, in accordance with an embodiment of the present disclosure. In an embodiment, computing unit 106 may be located at the remote-control station 104. In another embodiment, computing unit 106 may be located at the robotic unit 102.
[0053] Referring to Figure 2, computing unit 106 may include, but is not limited to, memory 202, a processor 204, an interface 206, and a plurality of modules 208. Memory 202, the interface 206, and the plurality of modules 208 may be coupled to the processor 204. In an embodiment, the plurality of modules 208 may include a network communication module 210, an artificial intelligence module 212, and a biometric authentication module 214.
[0054] The processor 204 can be a single processing unit or several units, all of which could include multiple computing units. The processor 204 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any device that manipulates signals based on operational instructions. Among other capabilities, processor 204 is configured to fetch and execute computer-readable instructions and data stored in memory 202.
[0055] In an embodiment of the present disclosure, the memory 202 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. Further, the memory 202 may include an operating system for performing one or more tasks of the computing unit 106, as performed by a generic operating system in the communications domain. Further, the computing unit 106 may be configured to enable the user to control the one or more operations of the robotic unit 102 from the remote location. A database 216 may include one or more parameters, data-driven models, Machine Learning (ML) models, and the like. The plurality of modules 208, amongst other things, includes routines, programs, objects, components, data structures, etc., which perform particular tasks or implement data types. The plurality of modules 208 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
[0056] Further, the plurality of modules 208 can be implemented in hardware, instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 204, a state machine, a logic array, or any other suitable device capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks or, alternatively, the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the plurality of modules 208 may be machine-readable instructions (software) that, when executed by a processor/processing unit, perform any of the described functionalities.
[0057] In some embodiments, the plurality of modules 208 may include a set of instructions that may be executed to cause the computing unit 106 to perform any one or more of the operations disclosed herein. The plurality of modules 208 may be configured to perform the steps of the present disclosure using the data stored in memory 202, as discussed throughout this disclosure. In an embodiment, each of the plurality of modules 208 may be hardware units that may be outside memory 202.
[0058] In an embodiment, the network communication module 210 may be configured to establish a secure, encrypted connection between the robotic unit 102 and the remote-control station 104 to enable real-time data exchange and control.
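For illustration only, a secure, encrypted connection of the kind described above could be established over TLS using Python's standard `ssl` module; the choice of TLS and the minimum protocol version are assumptions for this sketch, as the disclosure does not name a specific protocol, and certificate handling is deployment-specific and omitted.

```python
import ssl

def make_secure_context():
    """Build a TLS client context of the kind an encrypted robot-to-station
    link might use. Hypothetical sketch; certificate paths are omitted."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocol versions
    return ctx
```

In use, such a context would wrap the socket carrying probe commands and image data, giving both encryption and server authentication.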
[0059] In an embodiment, the artificial intelligence module 212 may be configured to identify anatomical landmarks of the subject and autonomously perform scanning procedures. The anatomical landmarks may include, but are not limited to, heart, liver, abdominal organs, and the like. The artificial intelligence module 212 may be configured to mask sensitive regions during ultrasound scans of the subject. The artificial intelligence module 212 may be configured to detect and highlight anomalies in one or more ultrasound images based on machine learning techniques with anomaly detection capabilities.
[0060] The one or more ultrasound images may refer to diagnostic medical images generated using ultrasonic sound waves by the robotic unit 102 equipped with advanced imaging technology. The one or more ultrasound images capture internal structures of the human body in real-time, providing non-invasive and detailed visualizations of tissues, organs, and abnormalities. The artificial intelligence module 212 may be configured to perform automated two-dimensional echocardiogram analysis for cardiac anomaly detection. The two-dimensional echocardiogram analysis may include, but is not limited to, scanning and visualizing cardiac structures, identifying and assessing liver morphology and potential abnormalities, and the like.
[0061] The artificial intelligence module 212 may include templates for common ultrasound scans, allowing the robotic unit 102 to follow optimized scanning paths. The artificial intelligence module 212 may dynamically adjust the robotic ultrasound probe positioning and angles based on patient anatomy and real-time imaging feedback. The artificial intelligence module 212 may be configured to ensure that the robotic ultrasound probe pressure and motion stay within safe thresholds using integrated force sensors and real-time artificial intelligence monitoring. The artificial intelligence module 212 may be configured to analyze ultrasound data to detect deviations from normal anatomical structures or expected tissue patterns. Detected anomalies may be highlighted and overlaid directly onto the ultrasound image with labels or graphical markers.
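As a rough illustration of the anomaly-highlighting step described above, the sketch below flags pixels whose intensity deviates strongly from the image mean and records their positions as overlay markers; the z-score statistic, the threshold value, and the list-of-lists image representation are all assumptions for this example, not the disclosed method.

```python
# Hypothetical sketch: flag strongly deviating regions as overlay markers.
from statistics import mean, pstdev

def find_anomalies(image, z_threshold=2.5):
    """image: 2D list of grayscale intensities. Returns (row, col) markers
    for pixels deviating more than z_threshold standard deviations."""
    flat = [px for row in image for px in row]
    mu, sigma = mean(flat), pstdev(flat)
    markers = []
    for r, row in enumerate(image):
        for c, px in enumerate(row):
            if sigma and abs(px - mu) / sigma > z_threshold:
                markers.append((r, c))  # overlay marker position
    return markers
```

A production system would use trained machine learning models rather than a global intensity statistic, but the output shape, i.e., a list of marker coordinates to overlay on the image, would be similar.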
[0062] Further, the artificial intelligence module 212 may be configured to transmit enhanced images including overlays to the computing unit 106 at the remote-control station 104. The enhanced images may provide immediate diagnostic assistance to the user. The artificial intelligence module 212 may be configured to adjust the position and angle of the robotic ultrasound probe automatically based on the detected anomalies to improve imaging clarity of the area of concern. The artificial intelligence module 212 may be configured to allow the user on the computing unit 106 to adjust overlay information, such as size, location, and suspected diagnosis, for clarity.
[0063] The biometric authentication module 214 may be configured to verify the identity of the user before enabling access to the handheld probe unit 116 and the curved haptic pad 118. For example, biometric authentication may refer to a security process that uses the user’s unique biological or behavioral characteristics to verify the identity. Biometric authentication may include, but is not limited to, biological traits, behavioral traits, and the like. The biological traits may include, but are not limited to, fingerprint recognition, iris recognition, retina scanning, facial recognition, voice recognition, and the like. The behavioral traits may include, but are not limited to, voice recognition, gait analysis, keystroke dynamics, and the like. The handheld probe unit 116 and the curved haptic pad 118 have further been explained in detail with reference to Figure 4 and Figure 5. A platform may be configured to facilitate forward and backward translational adjustments and enable manipulation of the robotic ultrasound probe to cover an anatomical range of the subject from head to toe for effective imaging.
[0064] The plurality of modules 208 may be in communication with each other. In an embodiment, the plurality of modules 208 may be a part of processor 204. In another embodiment, processor 204 may be configured to perform the functions of modules 208.
[0065] At least one of the modules 210, 212, and 214 may be implemented through the data-driven model. A function associated with the data-driven model may be performed through the non-volatile memory, the volatile memory, and the processor 204. Accordingly, processor 204 may include a plurality of processors. The plurality of processors may include a general-purpose processor, such as a central processing unit (CPU) or an application processor (AP), a graphics-only processing unit such as a graphics processing unit (GPU) or a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU). The plurality of processors controls the processing of the input data in accordance with a predefined operating rule or data-driven model stored in the non-volatile memory and the volatile memory. The predefined operating rule or artificial intelligence model is provided through training or learning. Reasoning prediction is a technique of logical reasoning and prediction based on determined information, and includes knowledge-based reasoning, optimization prediction, preference-based planning, and recommendation.
[0066] Figure 3 illustrates a schematic representation of the remote-control station 104, in accordance with an embodiment of the present disclosure. The remote-control station 104 may include a first screen 302, a second screen 304, a camera 306, the control panel 110, the handheld probe unit 116, and the curved haptic pad 118. The remote-control station 104 may be configured to enable the user to control the one or more operations of the robotic unit 102. The first screen 302 may be configured to provide a real-time ultrasound feed. The second screen 304 may be configured to provide real-time video of the subject. The camera 306 may be configured to capture the user in real-time. The control panel 110 may be configured to adjust one or more settings of the robotic ultrasound probe. The one or more settings may include depth, frequency, and imaging modes.
[0067] Figure 4 illustrates a schematic representation of the handheld probe unit 116 adapted to couple to the robotic unit 102 for performing the ultrasound imaging, in accordance with an embodiment of the present disclosure. The handheld probe unit 116 may include one or more sensors to detect one or more spatial parameters based on the hand movement of the user. The one or more sensors are configured to provide haptic feedback to the user based on the pressure that is being applied to the subject. Further, the handheld probe unit 116 may include a plurality of vibrators activated in response to one or more events. The one or more events may include an emergency, power loss, and internet disconnection. The handheld probe unit 116 may include a movable component 402 configured to slide within a stationary outer housing 404 of the handheld probe unit 116 in response to the applied pressure. Further, the handheld probe unit 116 may include an ergonomic grip 406 to support prolonged operation of the handheld probe unit 116. The handheld probe unit 116 may include a communication unit 408 adapted to transmit the one or more spatial parameters to the robotic unit 102 for controlling the movement of the robotic ultrasound probe along the body surface of the subject. The communication unit 408 may be a device equipped with hardware and software components configured to send and/or receive information. For example, the handheld probe unit 116 may use a Wireless-Fidelity (Wi-Fi) module to wirelessly transmit the spatial parameters to the robotic unit 102.
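One way the communication unit 408 might serialize the one or more spatial parameters for wireless transmission is a fixed binary layout; the seven-float packet below (three position values, three orientation values, and one applied-pressure value) is purely an assumed format for illustration, not a format specified in the disclosure.

```python
# Hypothetical packet layout for the spatial parameters.
import struct

PACKET_FMT = "<7f"  # x, y, z, roll, pitch, yaw, pressure (little-endian float32)

def pack_spatial_params(x, y, z, roll, pitch, yaw, pressure):
    """Encode the spatial parameters into a fixed-size binary packet."""
    return struct.pack(PACKET_FMT, x, y, z, roll, pitch, yaw, pressure)

def unpack_spatial_params(payload):
    """Decode a received packet back into the seven parameter values."""
    return struct.unpack(PACKET_FMT, payload)
```

A fixed layout like this keeps the per-update payload small (28 bytes), which suits the real-time, low-latency exchange the system requires.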
[0068] Figure 5 illustrates a schematic representation of the curved haptic pad 118, in accordance with an embodiment of the present disclosure. The curved haptic pad 118 may include a handheld probe calibrator 502, a haptic sensory pad 504, a curved wall 506, a control member 508, and a sensitivity adjuster 510. The curved haptic pad 118 may be positioned in conjunction with the handheld probe unit 116. The curved haptic pad 118 may be configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad 118 and provide pressure to the user based on feedback from the robotic unit 102 via the computing unit 106.
[0069] The handheld probe calibrator 502 may be configured to hold the handheld probe unit 116 to provide consistent and reliable feedback to the user. The curved wall 506 may be configured to enable the user to intuitively replicate the angles of the handheld probe unit 116. The curved wall 506 may include, but is not limited to, a haptic sensory curved wall, and the like. The control member 508 may be configured to control the three-link manipulator 602. The control member 508 may include, but is not limited to, a joystick, a touch control, and the like. The sensitivity adjuster 510 may be configured to enable real-time calibration of control responsiveness, allowing the user to fine-tune the sensitivity of the input controls based on operational requirements.
[0070] Figure 6A and Figure 6B illustrate schematic representations of the robotic unit 102, in accordance with an embodiment of the present disclosure. Specifically, Figure 6A shows a side view of robotic unit 102 whereas Figure 6B shows a front view of the robotic unit 102. The robotic unit 102 may include a three-link manipulator 602, a cantilever beam 604, a column 606, a platform 608, a communication interface module 610a and 610b, a foldable joint mechanism 612, a swivel joint mechanism 614, wheels 616, a robotic ultrasound probe 618, and a handle 620.
[0071] Column 606 may be disposed on the platform 608. Column 606 may be adapted to slide along the length of the platform 608. The cantilever beam 604 may extend from a side of column 606 and be adapted to move along a length of column 606 and relative to the platform 608. The three-link manipulator 602 may be slidably mounted to a distal end of the cantilever beam 604 via a support member 622. The foldable joint mechanism 612 may be configured to enable the robotic unit 102 to fold into a compact form. The swivel joint mechanism 614 may be positioned between column 606 and the platform 608. The swivel joint mechanism 614 may be configured to provide flexibility in positioning the robotic ultrasound probe 618 on either side of the subject.
[0072] Column 606 may be configured to provide vertical adjustments for optimal positioning of the robotic ultrasound probe 618. The three-link manipulator 602 may be configured to translate along the cantilever beam 604 to cover a horizontal sagittal plane of the subject (for example, X-Axis). The cantilever beam 604 may be configured to translate vertically along column 606 to adjust for the anatomy of the subject (for example, Y-Axis). Column 606 may be configured to move along the platform 608 for forward and backward adjustments (for example, Z-Axis).
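The three translational axes described above imply a bounded rectangular workspace for the probe. A minimal sketch of clamping a commanded target into such a workspace follows; the axis limits are illustrative assumptions, not dimensions from the disclosure.

```python
# Assumed travel limits in metres: X (manipulator along the beam),
# Y (beam along the column), Z (column along the platform).
WORKSPACE = {"x": (-0.4, 0.4), "y": (0.0, 1.2), "z": (-0.6, 0.6)}

def clamp_target(x, y, z):
    """Clamp a commanded (x, y, z) probe target into the reachable workspace."""
    def clamp(v, lo, hi):
        return min(max(v, lo), hi)
    return (clamp(x, *WORKSPACE["x"]),
            clamp(y, *WORKSPACE["y"]),
            clamp(z, *WORKSPACE["z"]))
```

Clamping at the command level is a simple safeguard that keeps remote inputs from driving any axis past its mechanical travel.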
[0073] The wheels 616 may be configured to enable the robotic unit 102 to be moved and deployed across one or more locations. The handle 620 may be a mechanical component configured to facilitate manual maneuvering of the robotic unit 102 by healthcare providers or operators. Handle 620 may be configured to provide an ergonomic interface for physically moving the robotic unit 102 to required locations, particularly in environments where automated navigation may not be practical or feasible. The communication interface module 610a and 610b may be configured to enable real-time bidirectional audio and video interaction between the user at the remote-control station 104 and the subject. The communication interface module 610a and 610b may be configured to enable the user to examine the one or more ultrasound images. The communication interface module 610a and 610b may include, but is not limited to, a touchscreen display, an output device, a standalone screen, a screen in an Internet of Things (IoT)-enabled device, a screen with network connectivity, and the like.
[0074] Figure 7 illustrates a schematic representation of the three-link manipulator 602, in accordance with an embodiment of the present disclosure. The three-link manipulator 602 may include a first link 702, a second link 704, a third link 706, the robotic ultrasound probe 618, and an excess force absorber 708.
[0075] The first link 702 may include a proximal end 710 rotatably coupled to the support member 622 and a distal end 712 formed at a predefined angle with respect to the proximal end 710 of the first link 702. The second link 704 may include a proximal end 714 rotatably coupled to the distal end 712 of the first link 702 and a distal end 716 formed at the predefined angle with respect to the proximal end 714 of the second link 704. The third link 706 may include a proximal end 718 rotatably coupled to the distal end 716 of the second link 704 and a distal end 720 opposite to the proximal end 718 of the third link 706 and having a mount. The proximal ends 710, 714, and 718 of the first link 702, the second link 704, and the third link 706 may be coaxial. The first link 702 may be typically cylindrical or tubular, providing structural strength while allowing smooth rotational and translational motion. The second link 704 may be rectangular or articulated with a curved profile to enhance reach and flexibility while maintaining a compact form factor. The third link 706 may be tapered or modular, designed for fine control and precision. The third link 706 may include a flexible or segmented structure for enhanced adaptability.
[0076] The first link 702, the second link 704, and the third link 706 may be collectively configured to allow movement of the robotic ultrasound probe 618 along the body surface of the subject. The third link 706 may include at least one force sensor configured to sense an amount of force being applied to the body surface of the subject and the excess force absorber 708 configured to absorb and dissipate force exceeding a predefined threshold value. The excess force absorber 708 may be configured to protect the subject by mitigating unintentional excess pressure from the robotic ultrasound probe 618.
[0077] The excess force absorber 708 may be configured to exhibit controlled motion across various Degrees of Freedom (DOF), enabling the excess force absorber 708 to handle forces from different directions without compromising the functionality of the system 100. For example, the excess force absorber 708 may move along the X, Y, or Z axes to counteract linear forces applied to the robotic unit 102. The excess force absorber 708 can absorb impact forces resulting from sudden probe movements or external interference.
[0078] The excess force absorber 708 may be capable of rotational adjustments around axes, allowing it to counteract torque or twisting forces. This motion may be critical when the robotic ultrasound probe 618 encounters uneven or curved surfaces during operation. The excess force absorber 708 may include mechanisms such as springs, dampers, or elastic materials that compress under force and expand back to an original position once the force is dissipated.
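The spring-like absorption behaviour described above can be sketched as follows: force beyond a threshold compresses a virtual spring and is not transmitted to the subject, while the spring relaxes once the load drops. The threshold value and spring stiffness below are invented for illustration, not parameters from the disclosure.

```python
# Hypothetical sketch of the threshold-and-spring behaviour of absorber 708.
class ExcessForceAbsorber:
    def __init__(self, threshold_n=10.0, stiffness=500.0):
        self.threshold = threshold_n  # forces above this are absorbed (N)
        self.k = stiffness            # assumed spring constant (N/m)
        self.compression = 0.0        # current spring compression (m)

    def apply(self, force_n):
        """Absorb force beyond the threshold; return force passed to the subject."""
        excess = max(0.0, force_n - self.threshold)
        self.compression = excess / self.k  # spring compresses under excess load
        return force_n - excess             # transmitted force is capped
```

The key property is that the transmitted force never exceeds the threshold, mirroring the safety role the disclosure assigns to the absorber.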
[0079] Figure 8 illustrates a front view of robotic unit 102 with the three-link manipulator 602 as shown in Figure 6A and Figure 6B. In one exemplary aspect, the three-link manipulator 602 may include the first link 702, the second link 704, and the third link 706. The three-link manipulator 602 may include a first motor 802a, a second motor 802b, and a third motor 802c. In an embodiment, the motors 802a, 802b, and 802c may be configured to allow the robotic ultrasound probe 618 to be operated remotely under the guidance of the user in various angles and dimensions for producing an ultrasound image of the subject’s body. The first link 702 may be connected to the first motor 802a with the help of a joint 804. The second motor 802b may be mounted on the first link 702. The first link 702 may be connected to a gear system 806. The gear system 806 may be configured to precisely transfer rotary motion from one point to the other. The second link 704 of the manipulator 602 may be connected to the gear system 806 using a joint 808. The third motor 802c may be mounted on the second link 704. A prismatic link 810 may be connected to the third motor 802c with the help of a joint 812. The joints 804, 808, and 812 allow flexible movement in a plane and rotary movement at different angles. In one aspect, the prismatic link 810 may be used for linear motion in one direction, and the prismatic link 810 may help to translate linear motion between the third link 706 and an end effector 816. In one aspect, the end effector 816 is the device at the end of the three-link manipulator 602, designed to interact with the environment. The end effector 816 may be connected to the prismatic link 810 with the help of a joint 814. The robotic ultrasound probe 618 may be held by the end effector 816.
[0080] Figure 9A and Figure 9B illustrate different views of the first link 702 of the robotic unit 102 with the three-link manipulator 602. The first link 702 may be coupled to the joint 804 as shown in Figure 8 using a coupling hub 902 which is attached to the link mount 904. The link mount 904 may be placed in between two identical plates 908 with multiple mounting holes 906, which help in configuring various other components with the first link 702. The mounting hole 910 may be configured for coupling the second link 704, as shown in Figures 8, 9A, and 9B, using the joint 808.
[0081] Figure 10A and Figure 10B illustrate different views of the second link 704 of the robotic unit 102 with the three-link manipulator 602. The second link 704 may be configured with multiple holes 1002 to accommodate various mountings, making the design modular. A coupling hub 1008 is mounted on the mounting plate 1006, as shown in Figure 10A, to attach the entire link to the joint 808. Further, another mounting plate 1004 is placed on the other end of the link to mount the piston 1102. Figure 10B shows the mounting plates 1006 and 1004.
[0082] Figure 11 illustrates the third link 706 of robotic unit 102 with the three-link manipulator 602. The third link 706 is shown along with the prismatic joint. The third link 706 may include a piston 1102 and a cylinder 1104 that act as a hydraulic actuation unit to move the end effector 816, which is attached to the piston 1102 with the help of the joint 814. The joint 814 may include two hinges 1106a and 1106b which help in altering the gripping area of the end effector 816.
[0083] Figure 12 illustrates a communication framework 1200 between the computing unit 106 and the handheld probe unit 116, according to an embodiment of the present disclosure. The computing unit 106 may include the processor 204, a data encryption and decryption unit 1202, a local server 1204, a manipulator simulation 1206, an ultrasound visualization unit 1208, an additional feature extraction and confidence score unit 1210, and an ultrasound control unit 1214. The handheld probe unit 116 may include a microcontroller 1212. The microcontroller 1212 is connected to the pressure unit 1216, the global probe positioner 1218, and the local probe orientation unit 1220. In one aspect, the pressure unit 1216 may be connected with the haptic input 1222, which transmits tactile information using sensations such as vibration, touch, and force feedback. The pressure unit 1216 also controls the pressure to be applied on the subject’s body while performing the ultrasound imaging process from the remote location. In one aspect, the angle of pressure applied on the subject’s body ranges from -90 to +90 degrees depending on the surface of the subject’s body. In one aspect, the local probe orientation unit 1220 is a mock probe unit provided to the user to move the robotic ultrasound probe 618 at the subject’s end.
[0084] The data encryption and decryption unit 1202 may be configured to encrypt and decrypt sensitive data, such as the ultrasound images and diagnostic results, ensuring secure communication between the remote-control station 104 and the expert system. The local server 1204 may be configured to temporarily store ultrasound data and enable real-time computation to support operations like visualization and feature extraction.
[0085] The manipulator simulation 1206 may be a software-based simulation tool configured to predict the movement of the three-link manipulator 602. The manipulator simulation 1206 may assist the user in remotely controlling and adjusting the robotic ultrasound probe 618 with precision. The ultrasound visualization unit 1208 may be a dedicated interface configured to render real-time ultrasound images, enabling the user to view, analyze, and interpret diagnostic data. The additional feature extraction and confidence score unit 1210 may be a module configured to analyze the ultrasound images to extract diagnostic features, such as tissue boundaries or abnormalities, and calculate a confidence score for the diagnostic results. The ultrasound control unit 1214 may be a software module configured to manage technical parameters of the ultrasound imaging, such as frequency, power, and resolution settings, ensuring optimal image quality.
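The confidence-score idea can be made concrete as a weighted feature sum passed through a logistic function, yielding a value in [0, 1] to present alongside a prediction. The features, weights, and functional form below are invented for illustration and do not reflect the actual model of unit 1210.

```python
# Hypothetical sketch: logistic confidence from extracted feature values.
import math

def confidence_score(features, weights, bias=0.0):
    """Return a confidence in [0, 1] from a weighted sum of feature values."""
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))
```

Reporting such a score with each detection lets the expert weigh the system's suggestion against their own reading of the image.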
[0086] The microcontroller 1212 may be a compact integrated circuit that directly interfaces with and controls peripheral devices such as the pressure unit 1216, the haptic input 1222, the global probe positioner 1218, and the local probe orientation unit 1220. The pressure unit 1216 may be configured to regulate the amount of pressure applied by the robotic ultrasound probe 618 on the subject’s body surface. Depending on the subject’s body surface, the pressure may be applied at angles ranging from -90 to +90 degrees. The haptic input 1222 may be connected to the pressure unit 1216. The haptic input 1222 may be configured to transmit tactile feedback, such as vibrations, touch, or force feedback, to the user. The global probe positioner 1218 may be a mechanism that determines and adjusts the global position of the robotic ultrasound probe 618 on the subject’s body to ensure comprehensive imaging coverage. The local probe orientation unit 1220 may be configured to replicate the movements of a mock probe. The local probe orientation unit 1220 may translate expert inputs into real-time orientation adjustments of the robotic ultrasound probe 618 on the subject’s body. A phantom or game control 1224 may be a control interface configured for high-precision manipulation of the robotic unit 102 during diagnostic or operational tasks. The phantom or game control 1224 may be configured to mimic a virtual or physical phantom environment to allow the user to simulate or execute movements with fine control. The control member 508 may be a user interface element configured to allow the user to control the robotic unit 102.
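The ±90-degree pressure-angle range can be illustrated by resolving an applied force into components relative to the surface normal, clamping out-of-range commands to the nearest limit. This decomposition is a sketch of the geometry only, not the disclosed control law.

```python
# Illustrative sketch: resolve probe pressure for an approach angle
# measured from the surface normal, limited to the stated ±90° range.
import math

def pressure_components(angle_deg, magnitude_n):
    """Return (normal, tangential) force components in newtons."""
    angle_deg = max(-90.0, min(90.0, angle_deg))  # clamp to allowed range
    rad = math.radians(angle_deg)
    return magnitude_n * math.cos(rad), magnitude_n * math.sin(rad)
```

At 0 degrees all force presses along the normal; at the ±90-degree limits it becomes purely tangential, which matches the description of using the walls of the curved structure for extreme angles.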
[0087] Figure 13 illustrates a subject’s computing unit 1300 located at the robotic unit 102, according to an embodiment of the present disclosure. The subject’s computing unit 1300 may include a processor 1302, a data encryption and decryption unit 1304, the local server 1204, an emergency unit 1306, a proximity sensor 1308, a pressure sensor 1310, an environmental analyzer 1312, an audio feed 1314, a video feed 1316, motor drivers 1326, an expert side audio/video unit 1328, a subject feedback unit 1330, a system auto-calibration unit 1332, gantry system motors 1334, and local system motors 1336. The emergency unit 1306 may be connected to a failure position configuration unit 1318, a manipulator direct control unit 1320, a system abort switch unit 1322, and an auxiliary power unit 1324. The subject sub-system is configured for carrying out ultrasound imaging from a remote location with the help of an expert.
[0088] The processor 1302 may be configured to control and coordinate the operations of other components in the subject’s computing unit 1300. The processor 1302 may execute algorithms for processing data received from various sensors, feeds, and control units. The data encryption and decryption unit 1304 may be responsible for securely encrypting sensitive data (e.g., patient information, diagnostic results) and decrypting incoming data for safe transmission and processing. The local server 1204 may act as a storage and processing hub for data locally generated by the robotic unit 102. The local server 1204 may support real-time processing and reduce dependency on external servers for immediate tasks. The emergency unit 1306 may be a dedicated system for managing emergency scenarios, such as hardware malfunctions or power failures. The emergency unit 1306 may work in tandem with failure management components like the failure position configuration unit 1318 and the system abort switch unit 1322.
[0089] The proximity sensor 1308 may be configured to detect the distance between robotic unit 102 and surrounding objects or the subject’s body. The proximity sensor 1308 may ensure safe and accurate positioning during operation. The pressure sensor 1310 may be configured to monitor the pressure applied by the robotic unit 102 on the subject’s body to prevent excessive force and ensure safe operation during procedures. The environmental analyzer 1312 may be configured to measure environmental factors like temperature, humidity, and air quality in the operating area. Further, the audio feed 1314 may be configured to capture and transmit audio from the subject’s end to the user. The audio feed 1314 may facilitate real-time communication and auditory analysis during the procedure. Further, the video feed 1316 may provide live video from the subject’s location for visual assessment by the user. The video feed 1316 may enhance the user’s situational awareness during procedures. The motor driver 1326 may be configured to control the motion of the robotic unit’s motors, enabling precise movements of components like probes and manipulators. The expert side audio or video unit 1328 may handle the transmission and reception of audio and video feeds between the user and the subject. The subject feedback unit 1330 may be configured to collect and transmit feedback from the subject, such as discomfort levels or physiological responses, to the user. Further, the subject feedback unit 1330 may help in adjusting the robotic unit’s operation for better patient care. Furthermore, the system auto-calibration unit 1332 may be configured to automatically adjust and calibrate the robotic unit’s sensors and actuators to ensure accuracy and reliability during procedures. The gantry system motors 1334 may be configured to drive the motion of the robotic unit’s gantry system, allowing for the positioning and alignment of the robotic components. 
Furthermore, the local system motors 1336 may be configured to control smaller, localized motions within the robotic unit 102, such as adjusting the probe angle or fine-tuning the position of diagnostic tools.
[0090] The failure position configuration unit 1318 may be configured to automatically position the robotic unit 102 to a predefined safe configuration during failures, minimizing risks and damage. The manipulator direct control unit 1320 may be configured to allow manual or direct control of the three-link manipulator 602 during emergency scenarios, bypassing automated systems. Further, the system abort switch unit 1322 may be configured to provide an immediate shutdown mechanism for halting the operations in case of critical issues. The auxiliary power unit 1324 may be configured to supply backup power to the robotic unit 102, ensuring continuous operation during power outages or failures.
[0091] In one aspect, the environmental analyzer 1312 may be connected with an encoder 1338 and a Light Detection and Ranging or Laser Imaging, Detection, and Ranging (LIDAR) unit 1340. The environmental analyzer 1312 may ensure that the robotic ultrasound probe 618 is always in contact with the subject’s skin, reducing cognitive load for the expert or doctor while maintaining consistent quality of the ultrasound.
[0092] In one aspect, the emergency unit 1306 may be connected with the auxiliary power unit 1324 to ensure that, in case of power failure, the user may be able to continue with the current examination.
[0093] A probe focused mode 1316a may be a camera mode configured to capture and transmit a detailed, close-up view of the robotic ultrasound probe 618 and the immediate surroundings of the robotic ultrasound probe 618. The probe focused mode 1316a may be particularly useful for precise monitoring of the probe’s position, orientation, and interaction with the target area on the body surface of the patient. A subject focused mode 1316b may be a camera mode configured to capture a broader view, showing the subject’s body and the overall interaction with the robotic unit 102. The subject focused mode 1316b may allow the user to monitor the subject’s posture, movements, and general condition during the procedure.
[0094] The best method of working of the present disclosure is disclosed herein. Initially, an expert sitting anywhere in the world with an active internet connection can perform an ultrasound on a subject at a different location. The expert’s identity is authenticated through biometric sensors. In the next step, the expert is provided with a controller on the table which lets the expert move the ultrasound probe remotely along the X-axis, Y-axis, and Z-axis and perform rocking, tilting, and rotation motions. Further, the expert is provided with the control to perform the ultrasound over the complete visible section of the abdominopelvic region, chest, arms, legs, and neck in a safe manner irrespective of the size of the subject. In the next step, the expert moves the controller probe within a curved-shaped structure, performing rocking, tilting, and rotation at every point of the curved surface; the expert pushes the probe against the curved structure to apply pressure at the abdomen, and the controller probe uses a load cell to measure this pressure. The pressure is applied at angles ranging from -90 to +90 degrees from the normal to the surface of the subject’s body, with the probe’s position being the center of the curved structure. For angles greater than or equal to +90 degrees or less than or equal to -90 degrees, the expert uses the walls or edges of the curved structure.
[0095] In the next step, the translation along the X-axis and Y-axis of the robotic ultrasound probe in the Cartesian coordinate system is controlled by a 2D position controller which can move the probe within a rectangular workspace. Further, one more autonomous Degree of Freedom (DOF) is added to make the probe translate along the Z-axis. The autonomous DOF ensures that the probe always stays in contact with the abdomen while performing the translation along the X-axis and Y-axis, unless the doctor chooses to control it separately.
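The autonomous Z-axis behaviour described above can be sketched as a simple proportional controller that nudges the probe along Z to hold a target contact force while the expert drives X and Y; the gain and target-force values are assumptions for illustration, not the disclosed controller.

```python
# Hypothetical sketch: keep the probe in contact by servoing Z on force error.
def z_contact_step(z, measured_force, target_force=5.0, gain=0.001):
    """Return the next Z position: advance when contact is light,
    retract when the measured force exceeds the target."""
    error = target_force - measured_force
    return z + gain * error  # positive error -> press further in
```

Run once per control cycle, such a loop would maintain contact automatically during X/Y translation, which is what frees the expert from managing depth by hand.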
[0096] In the next step, the expert is provided, through the haptic motors, with a scale of how much pressure is being applied, ensuring a level of safety. The controller is connected to a computing device. The expert’s computing device runs software for controlling the robotic arm. The displays at the expert side showcase the ultrasound image, the ultrasound machine controls, the subject’s side video and audio, settings and additional controls of the robotic arm, and a visualization of the probe movement and position through simulation.
[0097] In the next step, for a pre-natal ultrasound, the software masks the sex of the fetus, thereby eliminating the chance of sex determination. The computing device is integrated with the software and connected to a camera and a microphone, which relay the expert’s live video and audio for ease of communication with the subject. The software also assists the expert in diagnosis, as it detects morphologies using its own artificial intelligence and prompts its prediction with a confidence rating. The computing device is connected to the internet, and the data of the probe’s movement is transmitted to the subject’s device through the server. The data transmitted from the computing device is encrypted to ensure cybersecurity and privacy of the data; this data can only be decrypted at the concerned subject’s device.
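The shape of the protected probe-movement message can be sketched with standard-library primitives. This is only an illustration of tamper-evident packaging using an HMAC over a canonical JSON payload; the specification does not name a cipher or protocol, and a real deployment would additionally encrypt the channel (e.g. TLS or an authenticated cipher). The pre-shared key is hypothetical.

```python
import hashlib
import hmac
import json

SECRET = b"shared-session-key"  # hypothetical pre-shared session key

def pack(pose: dict) -> bytes:
    """Serialize a probe pose and prepend an authentication tag."""
    payload = json.dumps(pose, sort_keys=True).encode()
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    return tag + b"." + payload

def unpack(message: bytes) -> dict:
    """Verify the tag (constant-time compare) and recover the pose."""
    tag, payload = message.split(b".", 1)
    expect = hmac.new(SECRET, payload, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expect):
        raise ValueError("message rejected: authentication tag mismatch")
    return json.loads(payload)
```

Any modification of the payload in transit changes the expected tag, so the subject-side device rejects the command instead of executing it.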
[0098] In the next step, at the subject’s device, the transducer is held by a gripper designed to hold the transducer securely unless an authorized person needs to change it. The gripper is connected to a 4-DOF link configuration that allows it to perform rocking, tilting, and rotation, as well as apply pressure on the abdomen.
[0099] In the next step, the subject’s device is connected to another 3-DOF configuration that allows the rocking, tilting, and rotation to take place at any point within a cuboidal workspace. This robot configuration makes the rocking, tilting, and rotation movements independent of translation, thus ensuring the expert can perform these movements with greater precision. Once the data transmitted from the expert’s controller is received from the server, it is processed by an algorithm and converted into motor angles for the joints of the entire configuration. A clamp design is used to set and calibrate the system near the bed to ensure proper positioning. The clamp, when pulled, can be fixed on the bed to secure it and immobilize the configuration.
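Converting a received Cartesian target into joint angles is an inverse-kinematics step. The actual manipulator has three links with angled joints, so the two-link planar solution below is only a simplified stand-in showing the character of the computation; link lengths and the elbow-down branch choice are assumptions for the example.

```python
import math

def two_link_ik(x: float, y: float, l1: float, l2: float):
    """Joint angles (radians) for a 2-link planar arm reaching (x, y).

    Standard elbow-down closed-form solution: the elbow angle comes
    from the law of cosines, the shoulder angle from the target bearing
    minus the wrist offset.  Raises if the target is out of reach.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target outside reachable workspace")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

The subject-side controller would run a computation of this kind every time a decrypted pose update arrives, then drive each joint motor to its computed angle.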
[00100] In addition, to ensure the safety of the subject, there are proximity sensors on the links of the robot which ensure that the links never hit the subject’s abdomen. Load cells are present at the prismatic joint that actuates the transducer, through which the system limits the maximum force that can be applied on the subject’s body. The gripper has a collision detection mechanism: when it comes in contact with the subject’s body laterally, it triggers a switch that ensures the system never harms the subject. The subject always has a controller in hand for emergency purposes, such as communicating pain of any kind, or a kill switch to slow down the system and notify the attendee and the expert.
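The interlocks above amount to gating every motion command through a set of independent checks, any one of which can halt the probe. A minimal sketch, assuming a 15 N force ceiling (an illustrative value, not one from the specification):

```python
def safe_to_move(load_cell_n: float, collision_switch: bool,
                 kill_switch: bool, max_force_n: float = 15.0) -> bool:
    """Gate a motion command through the safety interlocks (sketch).

    Returns False if the subject's kill switch or the gripper's lateral
    collision switch has tripped, or if the load cell reports force
    above the assumed ceiling; motion proceeds only when all clear.
    """
    if kill_switch or collision_switch:
        return False
    return load_cell_n <= max_force_n
```

Structuring the checks as a single gate means a failure of any one sensor path fails safe: the probe stops rather than continuing on stale data.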
[00101] In addition, the present disclosure is equipped with an auxiliary power unit so that, in case of power failure, the expert is able to continue with the current examination. In the instance of poor connectivity, the attendee can take over partial control of the robotic arm after entering a security code or passing another security barrier.
[00102] In addition, the present disclosure resets to the home configuration in case of power failure or poor connectivity.
[00103] In addition, the present disclosure involves a robotic arm and ultrasound system wherein the subject sub-system is configured with a Global Positioning System (GPS) through which all the machines can be tracked, ensuring stronger implementation of the Pre-Conception and Pre-Natal Diagnostic Techniques (PC-PNDT) Act.
[00104] In addition, the present disclosure involves mesh visualization and additional feature extraction using artificial intelligence and machine learning, which can increase the expert’s confidence, and masking of the genital organs of the fetus using artificial intelligence, thereby restricting the chance of pre-natal sex determination.
[00105] In addition, the present disclosure integrates the controls (mainly position and orientation of the probe) for maneuvering the probe seamlessly, so that the learning curve for the device is reduced drastically. The computer can remember trajectories (positions and forces), allowing effortless repeat scans in the same mode.
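Trajectory memory of this kind reduces, in essence, to logging timestamped (position, force) samples during a scan and streaming them back for a repeat scan. A minimal sketch, with the class and method names chosen for illustration:

```python
class TrajectoryRecorder:
    """Record (position, force) samples and replay them (sketch).

    Positions are stored as immutable tuples and forces as floats so a
    replayed trajectory cannot be mutated by the caller after recording.
    """

    def __init__(self):
        self._samples = []

    def record(self, position, force):
        """Append one sample: a position (any sequence) and a force."""
        self._samples.append((tuple(position), float(force)))

    def replay(self):
        """Yield the stored samples in order for a repeat scan."""
        yield from self._samples
```

During replay, each yielded sample would be fed to the same motor-angle conversion used for live control, reproducing both the path and the applied pressure.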
[00106] In addition, the present disclosure has an ultrasound image tracking mode: the computer can be programmed for instances in which ultrasound image features need to be recognized and tracked, for example, when a vessel such as the carotid artery needs to be scanned longitudinally, or when the motion of a needle must be tracked.
[00107] In addition, the present disclosure maintains precise control over the applied pressure using vector calculation. To ensure the safety of the subject, the maximum pressure that can be applied is mechanically limited.
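The vector calculation referred to above is, at its simplest, projecting the applied force onto the surface normal at the contact point, so the controller regulates the component that actually presses into the body rather than the raw force magnitude. A sketch (the function name is illustrative):

```python
import math

def normal_force(force_vec, surface_normal):
    """Component of the applied force along the surface normal (sketch).

    Computes the dot product of the force vector with the unit surface
    normal; the tangential remainder slides the probe rather than
    pressing into the body.
    """
    dot = sum(f * n for f, n in zip(force_vec, surface_normal))
    mag = math.sqrt(sum(n * n for n in surface_normal))
    return dot / mag
```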
[00108] In this application, unless specifically stated otherwise, the use of the singular includes the plural, and the use of “or” means “and/or.” Furthermore, use of the terms “including” or “having” is not limiting. Any range described herein will be understood to include the endpoints and all values between the endpoints. Features of the disclosed embodiments may be combined, rearranged, omitted, etc., within the scope of the invention to produce additional embodiments. Furthermore, certain features may sometimes be used to advantage without a corresponding use of other features.
CLAIMS
WE CLAIM:
1. A system (100) for performing ultrasound imaging, comprising:
a robotic unit (102) comprising:
a platform (608);
a column (606) disposed on the platform (608), adapted to slide along a length of the platform (608);
a cantilever beam (604) extending from a side of the column (606) and adapted to move along a length of the column (606) and relative to the platform (608);
a three-link manipulator (602) slidably mounted to a distal end of the cantilever beam (604) via a support member (622), the three-link manipulator (602) comprising:
a first link (702) having a proximal end (710) rotatably coupled to the support member (622) and a distal end (712) formed at a predefined angle with respect to the proximal end (710) of the first link (702);
a second link (704) having a proximal end (714) rotatably coupled to the distal end (712) of the first link (702) and a distal end (716) formed at the predefined angle with respect to the proximal end (714) of the second link (704); and
a third link (706) having a proximal end (718) rotatably coupled to the distal end (716) of the second link (704) and a distal end (720) opposite to the proximal end (718) of the third link (706) and having a mount, configured to allow movement of a robotic ultrasound probe (618) along the body surface of a subject, wherein the third link (706) comprises at least one force sensor configured to sense an amount of force being applied to the body surface of the subject and an excess force absorber (708) configured to absorb and dissipate force exceeding a predefined threshold value; and
a remote-control station (104) configured to enable a user to control one or more operations of the robotic unit (102) from a remote location, wherein the remote-control station (104) comprises:
a handheld probe unit (116) comprising one or more sensors (120) to detect one or more spatial parameters based on a hand movement of the user, the handheld probe unit (116) configured to transmit the one or more spatial parameters to the robotic unit (102) for controlling movement of the robotic ultrasound probe (618) along the body surface of the subject; and
a curved haptic pad (118) positioned in conjunction with the handheld probe unit (116), wherein the curved haptic pad (118) is configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad (118) and provide pressure to the user based on one or more feedback signals indicating physical interactions experienced by the robotic ultrasound probe (618) on the body surface of the subject, wherein the curved haptic pad (118) is configured to receive the one or more feedback signals from the robotic unit (102) via a computing unit (106).
2. The system (100) as claimed in claim 1, wherein the computing unit (106) comprises a network communication module (210) configured to establish a secure, encrypted connection between the robotic unit (102) and the remote-control station (104) to enable real-time data exchange and control.
3. The system (100) as claimed in claim 1, wherein the computing unit (106) comprises an artificial intelligence module (212) configured to identify anatomical landmarks of the subject and autonomously perform scanning procedures, the artificial intelligence module (212) configured to mask sensitive regions during ultrasound scans of the subject.
4. The system (100) as claimed in claim 3, wherein the artificial intelligence module (212) is configured to detect and highlight anomalies in one or more ultrasound images based on machine learning techniques with anomaly detection capabilities.
5. The system (100) as claimed in claim 3, wherein the artificial intelligence module (212) is configured to perform automated two-dimensional echocardiogram analysis for cardiac anomaly detection.
6. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises a geo-tagging module (112) configured to detect and record location of the robotic unit (102).
7. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises a blockchain module (114) configured to store, manage, and share medical records and tele-diagnostic services.
8. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises a foldable joint mechanism (612) configured to enable the robotic unit (102) to fold into a compact form.
9. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises a swivel joint mechanism (614) positioned between the column (606) and the platform (608), wherein the swivel joint mechanism (614) is configured to provide flexibility in positioning the robotic ultrasound probe (618) on either side of the subject.
10. The system (100) as claimed in claim 1, wherein the platform (608) is configured to facilitate forward and backward translational adjustments and enable manipulation of the robotic ultrasound probe (618) to cover an anatomical range of the subject from head to toe for effective imaging.
11. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises at least two wheels (616) configured to enable the robotic unit (102) to be moved and deployed across one or more locations.
12. The system (100) as claimed in claim 1, wherein the robotic unit (102) comprises a communication interface module (610a and 610b) configured to enable real-time bidirectional audio and video interaction between the user at the remote-control station and the subject.
13. The system (100) as claimed in claim 12, wherein the communication interface module (610a and 610b) is configured to enable the user to examine one or more ultrasound images.
14. The system (100) as claimed in claim 1, wherein the computing unit (106) comprises a control panel (110) configured to adjust one or more settings of the robotic ultrasound probe (618), wherein the one or more settings comprises depth, frequency, and imaging modes.
15. The system (100) as claimed in claim 1, wherein the proximal ends (710, 714, 718) of the first link (702), the second link (704), and the third link (706) are coaxial.
16. A handheld probe unit (116) adapted to couple to a system (100) for performing ultrasound imaging, the handheld probe unit (116) comprising:
at least one sensor adapted to detect one or more spatial parameters associated with a hand movement of a user holding the probe unit;
a communication unit (408) adapted to transmit the one or more spatial parameters to a robotic unit (102) for controlling movement of a robotic ultrasound probe (618) along the body surface of a subject; and
a curved haptic pad (118) positioned in conjunction with the handheld probe unit (116), wherein the curved haptic pad (118) is configured to allow the user to apply pressure beyond a flat surface of the curved haptic pad (118) and provide pressure to the user based on one or more feedback signals indicating physical interactions experienced by the robotic ultrasound probe (618) on the body surface of the subject, wherein the curved haptic pad (118) is configured to receive the one or more feedback signals from the robotic unit (102).
17. The handheld probe unit (116) as claimed in claim 16, wherein the computing unit (106) comprises a biometric authentication module configured to verify the identity of the user before enabling access to the handheld probe unit (116) and the curved haptic pad (118).
18. The handheld probe unit (116) as claimed in claim 16, wherein the one or more spatial parameters comprise orientation, position, and applied pressure.
19. The handheld probe unit (116) as claimed in claim 16, wherein the handheld probe unit (116) comprises one or more sensors (120) configured to provide haptic feedback to the user based on the pressure being applied to the subject, wherein a plurality of vibrators is activated in response to one or more events, the one or more events comprising an emergency situation, power loss, and internet disconnection.
20. The handheld probe unit (116) as claimed in claim 16, wherein the curved haptic pad (118) comprises a control member (708) configured to control the three-link manipulator (602).
21. The handheld probe unit (116) as claimed in claim 16, wherein the curved haptic pad (118) comprises a sensitivity adjuster (510) configured to enable real-time calibration of control responsiveness, allowing the user to fine-tune the sensitivity of the input controls based on operational requirements.
22. The handheld probe unit (116) as claimed in claim 16, wherein the curved haptic pad (118) comprises a handheld probe calibrator (502) configured to hold the handheld probe unit (116) and calibrate it to provide consistent and reliable feedback to the user.
23. The handheld probe unit (116) as claimed in claim 16, wherein the curved haptic pad (118) comprises a curved wall (506) configured to enable the user to intuitively replicate angles of the handheld probe unit (116).
24. The handheld probe unit (116) as claimed in claim 16, wherein the handheld probe unit (116) comprises a movable component (402) configured to slide within a stationary outer housing (404) of the handheld probe unit (116) in response to the applied pressure.
25. The handheld probe unit (116) as claimed in claim 16, wherein the handheld probe unit (116) comprises an ergonomic grip (406) for prolonged operation of the handheld probe unit (116).
26. A three-link manipulator (602) operably coupled to a system (100) for performing ultrasound imaging, the three-link manipulator (602) comprising:
the three-link manipulator (602) being slidably mounted to a distal end of a cantilever beam (604) via a support member (622), and comprising:
a first link (702) having a proximal end (710) rotatably coupled to the support member (622) and a distal end (712) formed at a predefined angle with respect to the proximal end (710) of the first link (702);
a second link (704) having a proximal end (714) rotatably coupled to the distal end (712) of the first link (702) and a distal end (716) formed at the predefined angle with respect to the proximal end (714) of the second link (704); and
a third link (706) having a proximal end (718) rotatably coupled to the distal end of the second link (704) and a distal end opposite to the proximal end (718) of the third link (706) and having a mount, configured to allow movement of a robotic ultrasound probe (618) along the body surface of a subject, wherein the third link (706) comprises at least one force sensor configured to sense an amount of force being applied to the body surface of the subject and an excess force absorber (708) configured to absorb and dissipate force exceeding a predefined threshold value.
27. The three-link manipulator (602) as claimed in claim 26, wherein the proximal ends (710, 714, 718) of the first link (702), the second link (704), and the third link (706) are coaxial.
| # | Name | Date |
|---|---|---|
| 1 | 202341084539-STATEMENT OF UNDERTAKING (FORM 3) [11-12-2023(online)].pdf | 2023-12-11 |
| 2 | 202341084539-PROVISIONAL SPECIFICATION [11-12-2023(online)].pdf | 2023-12-11 |
| 3 | 202341084539-FORM FOR STARTUP [11-12-2023(online)].pdf | 2023-12-11 |
| 4 | 202341084539-FORM FOR SMALL ENTITY(FORM-28) [11-12-2023(online)].pdf | 2023-12-11 |
| 5 | 202341084539-FORM 1 [11-12-2023(online)].pdf | 2023-12-11 |
| 6 | 202341084539-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [11-12-2023(online)].pdf | 2023-12-11 |
| 7 | 202341084539-EVIDENCE FOR REGISTRATION UNDER SSI [11-12-2023(online)].pdf | 2023-12-11 |
| 8 | 202341084539-DRAWINGS [11-12-2023(online)].pdf | 2023-12-11 |
| 9 | 202341084539-DECLARATION OF INVENTORSHIP (FORM 5) [11-12-2023(online)].pdf | 2023-12-11 |
| 10 | 202341084539-Proof of Right [11-03-2024(online)].pdf | 2024-03-11 |
| 11 | 202341084539-FORM-26 [11-03-2024(online)].pdf | 2024-03-11 |
| 12 | 202341084539-DRAWING [09-12-2024(online)].pdf | 2024-12-09 |
| 13 | 202341084539-CORRESPONDENCE-OTHERS [09-12-2024(online)].pdf | 2024-12-09 |
| 14 | 202341084539-COMPLETE SPECIFICATION [09-12-2024(online)].pdf | 2024-12-09 |
| 15 | 202341084539-STARTUP [10-12-2024(online)].pdf | 2024-12-10 |
| 16 | 202341084539-FORM28 [10-12-2024(online)].pdf | 2024-12-10 |
| 17 | 202341084539-FORM-9 [10-12-2024(online)].pdf | 2024-12-10 |
| 18 | 202341084539-FORM 18A [10-12-2024(online)].pdf | 2024-12-10 |
| 19 | 202341084539-Power of Attorney [01-01-2025(online)].pdf | 2025-01-01 |
| 20 | 202341084539-FORM28 [01-01-2025(online)].pdf | 2025-01-01 |
| 21 | 202341084539-Form 1 (Submitted on date of filing) [01-01-2025(online)].pdf | 2025-01-01 |
| 22 | 202341084539-Covering Letter [01-01-2025(online)].pdf | 2025-01-01 |
| 23 | 202341084539-CERTIFIED COPIES TRANSMISSION TO IB [01-01-2025(online)].pdf | 2025-01-01 |
| 24 | 202341084539-FER.pdf | 2025-01-24 |
| 25 | 202341084539-FORM 3 [23-04-2025(online)].pdf | 2025-04-23 |
| 26 | 202341084539-OTHERS [15-05-2025(online)].pdf | 2025-05-15 |
| 27 | 202341084539-MARKED COPY [15-05-2025(online)].pdf | 2025-05-15 |
| 28 | 202341084539-FORM-8 [15-05-2025(online)].pdf | 2025-05-15 |
| 29 | 202341084539-FER_SER_REPLY [15-05-2025(online)].pdf | 2025-05-15 |
| 30 | 202341084539-CORRECTED PAGES [15-05-2025(online)].pdf | 2025-05-15 |
| 31 | 202341084539-CLAIMS [15-05-2025(online)].pdf | 2025-05-15 |
| 32 | 202341084539-US(14)-HearingNotice-(HearingDate-26-09-2025).pdf | 2025-09-10 |
| 33 | 202341084539-Correspondence to notify the Controller [23-09-2025(online)].pdf | 2025-09-23 |
| 34 | 202341084539-FORM-26 [25-09-2025(online)].pdf | 2025-09-25 |
| 35 | 202341084539-Written submissions and relevant documents [13-10-2025(online)].pdf | 2025-10-13 |
| 36 | 202341084539-FORM-8 [13-10-2025(online)].pdf | 2025-10-13 |
| 37 | 202341084539-FORM FOR STARTUP [13-10-2025(online)].pdf | 2025-10-13 |
| 38 | 202341084539-EVIDENCE FOR REGISTRATION UNDER SSI [13-10-2025(online)].pdf | 2025-10-13 |
| 39 | 202341084539-Response to office action [24-10-2025(online)].pdf | 2025-10-24 |
| 40 | 202341084539-PatentCertificate27-10-2025.pdf | 2025-10-27 |
| 41 | 202341084539-IntimationOfGrant27-10-2025.pdf | 2025-10-27 |

| # | Name |
|---|---|
| 1 | 202341084539_SearchStrategyNew_E_SearchHistory(13)E_23-01-2025.pdf |
| 2 | 202341084539_SearchStrategyAmended_E_search2AE_27-08-2025.pdf |