
Autonomous Laboratory Monitoring Robot And Method Thereof

Abstract: In laboratories where major experiments are run, instruments typically need frequent or constant monitoring, yet there is no centralized system for monitoring instruments. This disclosure relates to a method of monitoring one or more equipment at a laboratory station. The autonomous laboratory monitoring robot is navigated on a differential drive mobile platform at each section of the laboratory station. An image capturing unit of the autonomous laboratory monitoring robot is aligned by a two degree of freedom telescopic arm at a required angle to focus on a display screen of equipment at the laboratory station. A trained model is generated based on data obtained from images of the equipment. An object associated with the equipment is detected based on the trained model. The equipment performing technical activities is monitored at a predefined time interval in the laboratory station. The nearest docking station is dynamically detected, to which the autonomous laboratory monitoring robot navigates after an assigned scan is performed.


Patent Information

Application #:
Filing Date: 23 June 2023
Publication Number: 52/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. KULANGARA MURIYIL, Robin Tommy
Tata Consultancy Services Limited, Electronics Technology Park SEZ - 1, Technopark Campus, Trivandrum, Thiruvananthapuram – 695581, Kerala, India
2. SAXENA, Amit
Tata Consultancy Services Limited, Plot No. 362-363, Phase-IV, Udyog Vihar, Gurgaon – 122016, Haryana, India
3. RAVINDRANATHAN, Reshmi
Tata Consultancy Services Limited, Electronics Technology Park SEZ - 1, Technopark Campus, Trivandrum, Thiruvananthapuram – 695581, Kerala, India
4. JOHNY, Georgekutty
Tata Consultancy Services Limited, Electronics Technology Park SEZ - 1, Technopark Campus, Trivandrum, Thiruvananthapuram – 695581, Kerala, India
5. NAGPAL, Divya Arora
Tata Consultancy Services Limited, 4 & 5th floor, PTI Building, No 4, Sansad Marg, New Delhi – 110001, Delhi, India
6. BABU, Amal
Tata Consultancy Services Limited, Electronics Technology Park SEZ - 1, Technopark Campus, Trivandrum, Thiruvananthapuram – 695581, Kerala, India
7. SIVAN, Vishnu
Tata Consultancy Services Limited, Electronics Technology Park SEZ - 1, Technopark Campus, Trivandrum, Thiruvananthapuram – 695581, Kerala, India
8. VARMA, Binuja
Tata Consultancy Services Limited, Plot No. A-44 & A-45, Ground, 1st to 5th Floor & 10th floor, Block - C & D, Sector - 62, Noida – 201309, Uttar Pradesh, India

Specification

Description: FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
AUTONOMOUS LABORATORY MONITORING ROBOT AND METHOD THEREOF

Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The disclosure herein generally relates to a laboratory monitoring system, and, more particularly, to an autonomous laboratory monitoring robot and method thereof.

BACKGROUND
[002] Typically, in laboratories where major experiments are run, instruments need frequent or continuous monitoring. This either requires the presence of a user, such as a scientist or a laboratory attendant, or has to be performed using external systems. Experiments may take approximately 48 to 72 hours to complete, and it is not practically possible for the user to remain in the laboratory for that entire duration. As the laboratory is a highly secure environment, no data or assets can be taken out of it. Valuable time of the scientists is also spent observing the instruments while experiments are in progress; scientists spend about 70 percent of their time observing instruments manually. If any failure or error occurs while the scientist is not around, he or she may learn of it only upon returning to the instrument, and there is no way to determine exactly when the error occurred or what ambient conditions caused it. The laboratory environment must also be kept contamination free, and frequent visits by personnel can increase contamination. Further, proper conditions need to be maintained inside the room and near the instruments, and these conditions need to be monitored, but there is no centralized system to implement this. On the other hand, monitoring instrument performance and the laboratory environment during such long experiments is crucial. Alternatively, automated laboratory monitoring robots have been used. For monitoring purposes, the dependency is usually on camera and vision-based solutions, but such solutions may not be viable where security must be ensured, and it is not possible to have cameras that monitor both the operations and the data from the equipment in real time. If camera or surveillance systems for monitoring instruments are installed individually for each system, managing them and making any changes to them becomes extremely difficult. The privacy of the laboratory environment and of the people in the laboratory is also compromised if video cameras are used for monitoring instruments.

SUMMARY
[003] Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, there is provided an autonomous laboratory monitoring robot for monitoring one or more equipment at one or more laboratory stations. The autonomous laboratory monitoring robot includes a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to: navigate, on a differential drive mobile platform, the autonomous laboratory monitoring robot at each section of the one or more laboratory stations; align, by a two degree of freedom (DOF) telescopic arm, an image capturing unit of the autonomous laboratory monitoring robot at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations; obtain, by the image capturing unit, one or more images of the one or more equipment to extract data; generate, by a convolutional neural network, a trained model based on the data associated with the one or more equipment; detect, one or more objects associated with the one or more equipment based on the trained model; monitor, by the autonomous laboratory monitoring robot, the one or more equipment performing one or more technical activities at a predefined time interval in the one or more laboratory stations; and dynamically detect, one or more docking stations which are located nearer to navigate the autonomous laboratory monitoring robot after an assigned scan is performed. The one or more objects pertain to the display screen of the one or more equipment. The docking station which is located nearer is detected based on one of (a) current position of the autonomous laboratory monitoring robot, or (b) an overall layout of operational path, or (c) a location and a distance to be traveled to reach each docking station, and combination thereof.
[004] In another aspect, a processor implemented method of monitoring one or more equipment at one or more laboratory stations by an autonomous laboratory monitoring robot is provided. The processor implemented method includes at least one of: navigating, via one or more hardware processors, the autonomous laboratory monitoring robot on a differential drive mobile platform at each section of the one or more laboratory stations; aligning, by a two degree of freedom (DOF) telescopic arm, an image capturing unit of the autonomous laboratory monitoring robot at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations; obtaining, by the image capturing unit, one or more images of the one or more equipment to extract data; generating, by a convolutional neural network, a trained model based on the data associated with the one or more equipment; detecting, via the one or more hardware processors, one or more objects associated with the one or more equipment based on the trained model; monitoring, by the autonomous laboratory monitoring robot, the one or more equipment performing one or more technical activities at a predefined time interval in the one or more laboratory stations; and dynamically detecting, via the one or more hardware processors, one or more docking stations to navigate the autonomous laboratory monitoring robot after an assigned scan is performed. The one or more objects pertain to the display screen of the one or more equipment. The docking station which is located nearer is detected based on one of (a) current position of the autonomous laboratory monitoring robot, or (b) an overall layout of operational path, or (c) a location and a distance to be traveled to reach each docking station, and combination thereof.
[005] In yet another aspect, there are provided one or more non-transitory machine readable information storage mediums comprising one or more instructions which when executed by one or more hardware processors causes at least one of: navigating, the autonomous laboratory monitoring robot on a differential drive mobile platform at each section of the one or more laboratory stations; aligning, by a two degree of freedom (DOF) telescopic arm, an image capturing unit of the autonomous laboratory monitoring robot at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations; obtaining, by the image capturing unit, one or more images of the one or more equipment to extract data; generating, by a convolutional neural network, a trained model based on the data associated with the one or more equipment; detecting, one or more objects associated with the one or more equipment based on the trained model; monitoring, by the autonomous laboratory monitoring robot, the one or more equipment performing one or more technical activities at a predefined time interval in the one or more laboratory stations; and dynamically detecting, one or more docking stations which are located nearer to navigate the autonomous laboratory monitoring robot after an assigned scan is performed. The one or more objects pertain to the display screen of the one or more equipment. The docking station which is located nearer is detected based on one of (a) current position of the autonomous laboratory monitoring robot, or (b) an overall layout of operational path, or (c) a location and a distance to be traveled to reach each docking station, and combination thereof.
[006] It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS
[007] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
[008] FIG. 1 illustrates a system to remotely monitor one or more laboratory stations by interacting with an autonomous laboratory monitoring robot, according to an embodiment of the present disclosure.
[009] FIG. 2 is a functional block diagram illustrating an exemplary system interacting with the autonomous laboratory monitoring robot to remotely monitor the one or more laboratory stations, according to an embodiment of the present disclosure.
[010] FIG. 3A - FIG. 3B are an isometric view and a top view of the autonomous laboratory monitoring robot respectively, according to an embodiment of the present disclosure.
[011] FIG. 4 is an exemplary flow diagram illustrating a method of automatically adjusting an image capturing unit of the autonomous laboratory monitoring robot for the telescopic mechanism, according to an embodiment of the present disclosure.
[012] FIG. 5 is an exemplary flow diagram illustrating a method of detecting one or more objects from one or more captured images by the autonomous laboratory monitoring robot, according to an embodiment of the present disclosure.
[013] FIG. 6A and FIG. 6B are exemplary flow diagrams illustrating a method of monitoring one or more equipment at the one or more laboratory stations by the autonomous laboratory monitoring robot, according to an embodiment of the present disclosure.

DETAILED DESCRIPTION OF EMBODIMENTS
[014] Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
[015] There is a need for automated systems to remotely monitor one or more laboratory stations. Embodiments of the present disclosure provide an autonomous laboratory monitoring robot to remotely monitor and surveil one or more laboratory stations and the environments around the one or more laboratory stations. The autonomous laboratory monitoring robot is a flexible robotic ecosystem for repetitive lab surveillance activities, and is ideal for continuous and intermittent monitoring of one or more equipment at the one or more laboratory stations. The autonomous laboratory monitoring robot includes a mobile base with a two degree of freedom (DOF) telescopic arm for adjusting height and horizontal angle to capture images of the equipment for indoor applications. From the captured images, display panels of the one or more equipment are detected using an artificial intelligence technique, i.e., a convolutional neural network (CNN). Necessary data is extracted and uploaded for further perusal by users. The users are referred to as, but not limited to, scientists, lab technicians/attendants, etc. Various collaborative and safety features are included to ensure seamless integration in a regular operating environment with people and other objects. For example, the collaborative features include (a) a mirror mechanism for adjusting the laboratory equipment display tilt without interference to the laboratory and robot ecosystem, and (b) a live streaming feature accessible to users with a web interface. Parameters associated with the one or more laboratory stations are also constantly monitored using in-built sensors of the autonomous laboratory monitoring robot.
[016] The users can remotely schedule scanning activities for the autonomous laboratory monitoring robot through a user interface. The autonomous laboratory monitoring robot can be navigated in an indoor environment through a guided line track. The autonomous laboratory monitoring robot detects one or more junction crossings and dynamically adjusts corresponding speed based on radio frequency sensors placed on the floor. The autonomous laboratory monitoring robot detects obstacles or people blocking the path, controls its motion safely, and continues navigation once the obstacles are removed. The autonomous laboratory monitoring robot can autonomously move to a corresponding docking station when not in operation, or if a power supply (e.g., battery) becomes critically low during operation.
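For purposes of illustration only, the scheduled-scan and guided-line navigation behavior described above may be sketched as a small state machine in Python. This is a minimal sketch under assumed interfaces; the helper names (schedule.scan_due, robot.follow_line, robot.read_floor_rfid, robot.obstacle_ahead, robot.battery_level, robot.go_to_nearest_dock) are hypothetical placeholders and do not represent the robot's actual firmware.

# Minimal sketch of the guided-line navigation loop (assumed helper APIs).
import time
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    NAVIGATING = auto()
    PAUSED_FOR_OBSTACLE = auto()
    DOCKING = auto()

JUNCTION_SPEED = 0.1   # m/s, assumed slower speed at junction crossings
CRUISE_SPEED = 0.4     # m/s, assumed speed on straight track segments
LOW_BATTERY = 25       # percent, within the 20-30 percent range noted later

def navigation_loop(robot, schedule):
    state = State.IDLE
    while True:
        if state is State.IDLE and schedule.scan_due():
            state = State.NAVIGATING
        elif state is State.NAVIGATING:
            if robot.battery_level() < LOW_BATTERY or schedule.scan_complete():
                state = State.DOCKING
            elif robot.obstacle_ahead():
                robot.stop()
                state = State.PAUSED_FOR_OBSTACLE
            else:
                tag = robot.read_floor_rfid()   # RF tag on the floor, if any
                speed = JUNCTION_SPEED if tag and tag.is_junction else CRUISE_SPEED
                robot.follow_line(speed)
        elif state is State.PAUSED_FOR_OBSTACLE and not robot.obstacle_ahead():
            state = State.NAVIGATING            # resume once the path is clear
        elif state is State.DOCKING:
            robot.go_to_nearest_dock()
            state = State.IDLE
        time.sleep(0.05)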
[017] Referring now to the drawings, and more particularly to FIG. 1 through 6B, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
[018] Reference numerals of one or more components of an autonomous laboratory monitoring robot for remotely monitoring one or more laboratory stations as depicted in the FIG. 1 through FIG. 3B are provided in Table 1 below for ease of description.
S.NO   NAME OF COMPONENT                             REFERENCE NUMERAL
1      Autonomous laboratory monitoring robot        214
2      Image capturing unit                          302A-B
3      Adjustable gripping unit                      304
4      Two degree of freedom (DOF) telescopic arm    306
5      Differential drive mobile platform            308
6      Adjustable wheel                              310
7      Charging contacts                             312A-B
TABLE 1
[019] FIG. 1 illustrates a system 100 to remotely monitor one or more laboratory stations 212A-N by interacting with the autonomous laboratory monitoring robot 214, according to an embodiment of the present disclosure. In an embodiment, the system 100 includes one or more processor(s) 102, communication interface device(s) or input/output (I/O) interface(s) 106, and one or more data storage devices or memory 104 operatively coupled to the one or more processors 102. The memory 104 includes a database. The one or more processor(s) 102, the memory 104, and the I/O interface(s) 106 may be coupled by a system bus, such as the system bus 108, or a similar mechanism. The one or more processor(s) 102, which are hardware processors, can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the one or more processor(s) 102 are configured to fetch and execute computer-readable instructions stored in the memory 104. In an embodiment, the system 100 can be implemented in a variety of computing systems, such as laptop computers, notebooks, hand-held devices, workstations, mainframe computers, servers, a network cloud, and the like.
[020] The I/O interface device(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface device(s) 106 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, an external memory, a camera device, and a printer. Further, the I/O interface device(s) 106 may enable the system 100 to communicate with other devices, such as web servers and external databases. The I/O interface device(s) 106 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, local area network (LAN), cable, etc., and wireless networks, such as Wireless LAN (WLAN), cellular, or satellite. In an embodiment, the I/O interface device(s) 106 can include one or more ports for connecting a number of devices to one another or to another server. The network may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[021] The memory 104 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. In an embodiment, the memory 104 includes a plurality of modules 110 and a repository 112 for storing data processed, received, and generated by the plurality of modules 110. The plurality of modules 110 may include routines, programs, objects, components, data structures, and so on, which perform particular tasks or implement particular abstract data types.
[022] Further, the database stores information pertaining to inputs fed to the system 100 and/or outputs generated by the system 100 (e.g., data/output generated at each stage of the data processing), specific to the methodology described herein. More specifically, the database stores information being processed at each step of the proposed methodology.
[023] Additionally, the plurality of modules 110 may include programs or coded instructions that supplement applications and functions of the system 100. The repository 112, amongst other things, includes a system database 114 and other data 116. The other data 116 may include data generated as a result of the execution of one or more modules in the plurality of modules 110. Further, the database stores information pertaining to inputs fed to the system 100 and/or outputs generated by the system 100 (e.g., at each stage), specific to the methodology described herein. Herein, the memory, for example the memory 104, and the computer program code are configured to, with the hardware processor, for example the processor 102, cause the system 100 to perform various functions described hereunder.
[024] FIG. 2 is a functional block diagram illustrating an exemplary system 200 interacting with the autonomous laboratory monitoring robot 214 to remotely monitor the one or more laboratory stations 212A-N, according to an embodiment of the present disclosure. FIG. 3A - FIG. 3B are an isometric view and a top view of the autonomous laboratory monitoring robot 214 respectively, according to an embodiment of the present disclosure. FIG. 4 is an exemplary flow diagram illustrating a method of automatically adjusting the image capturing unit 302A-B of the autonomous laboratory monitoring robot 214 for the telescopic mechanism, according to an embodiment of the present disclosure. FIG. 5 is an exemplary flow diagram illustrating a method of detecting objects from one or more captured images by the autonomous laboratory monitoring robot 214, according to an embodiment of the present disclosure. The system 200 may be an example of the system 100 (FIG. 1). In an example embodiment, the system 200 may be embodied in, or is in direct communication with, the system, for example the system 100 (FIG. 1). The system 200 includes a cloud 202, a web interface 204, a user device 206, one or more docking stations 208A-B, radio frequency identification (RFID) tags 210A-N, the one or more laboratory stations 212A-N, and the autonomous laboratory monitoring robot 214. Herein, the system 200 may acquire input data by interacting with the autonomous laboratory monitoring robot 214 to remotely monitor the one or more laboratory stations 212A-N. For example, one or more environment parameters are captured, a decision such as on the temperature of the environment is made, and it is ensured that the environment temperature is maintained according to the laboratory equipment rating. In an embodiment, the system 200 interfaces with one or more user devices, collectively referred to as the user device hereinafter. In an embodiment, the user device may be embodied in a handheld electronic device, a mobile phone, a smartphone, a portable computer, a PDA, and so on. The user device is communicatively coupled to the system 200 through a network and may be capable of providing input data to the cloud 202. In an embodiment, the user device includes an application to capture one or more images; one or more features are extracted from the one or more images and shared with the cloud 202.
[025] The autonomous laboratory monitoring robot 214 consists of the image capturing unit 302A-B, the adjustable gripping unit 304, the two degree of freedom (DOF) telescopic arm 306, the differential drive mobile platform 308, the adjustable wheel 310, and the one or more charging contacts 312A-B. The autonomous laboratory monitoring robot 214 navigates along a line marked on the floor, e.g., a guided line track. The autonomous laboratory monitoring robot 214 can move forward and backward along the line. The autonomous laboratory monitoring robot 214 detects junction crossings and dynamically adjusts speed based on radio frequency sensors placed on the floor of the one or more laboratory stations 212A-N. The autonomous laboratory monitoring robot 214 is designed to detect obstacles and to differentiate between various instruments and articles inside the laboratory. The autonomous laboratory monitoring robot 214 can identify one or more classes of objects (e.g., about forty classes such as instruments, heating, ventilation, and air conditioning (HVAC) systems, furniture, etc.). The autonomous laboratory monitoring robot 214 identifies the instruments among all the other objects it approaches. The autonomous laboratory monitoring robot 214 is also highly contextual and can change its mode of operation based on the context of the environment. The autonomous laboratory monitoring robot 214 understands and recognizes context and performs contextual decision making. For example, the autonomous laboratory monitoring robot 214 utilizes multiple techniques (e.g., Ultra-wideband (UWB), Near-field communication (NFC), Radio Frequency Identification (RFID), and Bluetooth Low Energy (BLE)) to identify one or more assets or one or more articles in inventory. The autonomous laboratory monitoring robot 214 can do occasional runs in a warehouse and keep the register of the inventory updated. The autonomous laboratory monitoring robot 214 can report missing items and anomalies in the inventory. The autonomous laboratory monitoring robot 214 can also provide alerts on misplaced items or missing items.
[026] The autonomous laboratory monitoring robot 214 can differentiate between different kinds of obstacles and instruments based on a number of different parameters: (a) data from one or more navigational sensors, i.e., the data from the navigational sensors is plotted on a confidence probability graph (CPG), and the CPG provides insight into the shape of any object in the vicinity of the sensors, (b) proximity to the UWB, NFC, RFID, and BLE markers, i.e., the instruments in the laboratory are tagged using one or more markers, (c) the current location of the autonomous laboratory monitoring robot 214, and (d) data from the camera feed. In an embodiment, the proximity can be identified using one or more readers in the autonomous laboratory monitoring robot 214. In an embodiment, the camera of the autonomous laboratory monitoring robot 214 is activated if the location is in the vicinity of any obstacle or near an instrument. For example, the obstacle can be a person/human, a random obstacle such as a chair/table, or another instrument. In an embodiment, radio frequency navigation techniques are utilized (e.g., a radio frequency identifier (RFID) 210A-N placed on the floor of the one or more laboratory stations 212A-N) for dynamically adjusting the speed at junctions and at each station. Based on the RFID information, navigation of the autonomous laboratory monitoring robot 214 in the laboratory environment is performed. For example, the orientation of the autonomous laboratory monitoring robot 214 is adjusted with respect to the navigation path if the detected RFID corresponds to a docking approach ID. The web interface 204 includes user authentication. The admin can control the autonomous laboratory monitoring robot 214 manually as well as schedule scan timings. Normal users can see the live status of one or more sensors of the autonomous laboratory monitoring robot 214 and can have access to the live stream from a smartphone. In an embodiment, only authorized users can use the web interface 204 for the autonomous laboratory monitoring robot 214. The admin can schedule a navigation scanning cycle with the web interface 204. The admin can initiate an unscheduled scan, subject to the previous scan timings and the upcoming scan timings. The admin can also manually control the autonomous laboratory monitoring robot 214 with the web page user controls and can monitor the sensor and actuator status of the autonomous laboratory monitoring robot 214 in real time.
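For purposes of illustration only, the combination of the four cues listed above may be sketched in Python as a simple classification rule. The input names (cpg_shape_score, marker_id, robot_location, camera_label) and the instrument_map data structure are hypothetical assumptions for the sketch and do not form part of the disclosed implementation.

# Minimal sketch of combining CPG shape cues, marker proximity, location, and the
# camera feed to classify a nearby object (all inputs are assumed placeholders).
def classify_nearby_object(cpg_shape_score, marker_id, robot_location,
                           camera_label, instrument_map):
    """Return a coarse label: 'instrument', 'person', or 'obstacle'."""
    # (b) proximity to a UWB/NFC/RFID/BLE marker: tagged objects are instruments.
    if marker_id is not None and marker_id in instrument_map:
        return "instrument"
    # (d) camera feed: consulted when the robot is in the vicinity of something.
    if camera_label == "person":
        return "person"
    # (a) CPG shape cue and (c) current location: near a known station, a large
    # static shape is most likely the stationed instrument.
    near_station = robot_location in {s.location for s in instrument_map.values()}
    if near_station and cpg_shape_score > 0.7:
        return "instrument"
    return "obstacle"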
[027] The autonomous laboratory monitoring robot 214 includes (a) a mobile base, i.e., the differential drive mobile platform 308, for navigating in an indoor environment (e.g., the one or more laboratory stations 212A-N), (b) the two degree of freedom (DOF) telescopic arm 306 for focusing the image capturing unit 302A-B (e.g., front camera and rear camera) on a display panel of the equipment at each laboratory station according to the height and position of the instrument, and (c) a docking station for charging the autonomous laboratory monitoring robot 214. The differential drive mobile platform 308 consists of geared direct current (DC) motors with an encoder for precise movement, powered by a robot operating system (ROS) framework to establish communication between multiple sensors and actuators in the hardware of the autonomous laboratory monitoring robot 214. The adjustable gripping unit 304 is configured to hold the image capturing unit 302A-B. The two degree of freedom (DOF) telescopic arm 306 can adjust the height and a horizontal angle to capture the images of the instrument for indoor applications. In an embodiment, the two degree of freedom (DOF) telescopic arm 306 moves and adjusts vertical height and horizontal camera pan controls. For example, the two degree of freedom (DOF) telescopic arm 306 provides vertical height adjustment and 180 degree rotation, covering a total field of vision of 360 degrees horizontally along a height of 120 cm to 180 cm from ground level, and the mechanism takes two minutes and forty-six seconds to rise from the lowest position to the highest position. The autonomous laboratory monitoring robot 214 navigates each part of the one or more laboratory stations 212A-N by the differential drive mobile platform 308. In an embodiment, the one or more equipment corresponds to one or more instruments that perform different kinds of activity at the one or more laboratory stations 212A-N, e.g., but not limited to, analysis of fluids and chemicals, scientific experimentation, etc. In an embodiment, the one or more equipment are alternatively referred to as one or more instruments. The adjustable wheel 310 is configured to adjust height for levelling the autonomous laboratory monitoring robot 214.
[028] The autonomous laboratory monitoring robot 214 corrects corresponding orientation automatically to align the camera at the required angle at each laboratory station to capture one or more images of each instrument. The camera can be aligned parallel to the autonomous laboratory monitoring robot 214 in two ways. In one setup scenario, the base of the autonomous laboratory monitoring robot 214 can correct corresponding orientation with the support of sensor fusion techniques. Another setup scenario includes correction of a horizontal telescopic actuator to align the camera of the autonomous laboratory monitoring robot 214 with the instrument display panel at each laboratory station. In an embodiment, the autonomous laboratory monitoring robot 214 can adjust corresponding height (e.g., 3.3 ft to 5.6 ft) to observe any kind of instrument that can be observed by a human. The autonomous laboratory monitoring robot 214 includes additional degrees of freedom to adjust corresponding angle and tilt so that instrument displays at any angle can be properly recognized, observed, and analyzed. The autonomous laboratory monitoring robot 214 interacts with any kind of equipment in the one or more laboratory stations 212A-N. The autonomous laboratory monitoring robot 214 also includes a manipulator (e.g., a robotic arm) which can interact with each instrument interface in a three-dimensional (3D) space by recognizing the information on the interface. An admin can train the autonomous laboratory monitoring robot 214 to respond to prompts that require an input from the user (e.g., OK, cancel, continue, etc.). The autonomous laboratory monitoring robot 214 can touch or interact with the display panel of each instrument to provide the inputs using the corresponding manipulator arm, thus augmenting a scientist. For example, if the autonomous laboratory monitoring robot 214 finds any item in the instrument interface that requires any manual action, a decision is taken by the autonomous laboratory monitoring robot 214. Alternatively, the scientist is alerted about the event. The autonomous laboratory monitoring robot 214 logs all the interactions with the machine to be used for any future reference or verification by the scientists, and the log is available to the scientists in a user-friendly presentation. The display panel of each instrument may not always be perfectly vertical and parallel to the camera of the autonomous laboratory monitoring robot 214. When the autonomous laboratory monitoring robot 214 reaches a new instrument for the first time, a ‘quick seek’ operation is performed, i.e., the camera does a fast scan to detect even the tiniest part of the equipment display. The two degree of freedom (DOF) telescopic arm 306 traverses vertically, and at the same time the camera scans horizontally by rotating about the corresponding axis. During the quick seek operation, some parts of the equipment display are detected. If not, the quick seek is performed in a different manner until some part of the equipment display is detected.
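For purposes of illustration only, the ‘quick seek’ operation may be sketched in Python as a coarse sweep over arm height and camera pan. The arm, camera, and display_detector objects, the step sizes, and the pan sweep are assumptions for the sketch, not the actual control logic of the robot.

# Minimal sketch of 'quick seek': raise the telescopic arm while panning the camera
# until any fragment of an equipment display is detected (assumed helper APIs).
ARM_MIN_CM, ARM_MAX_CM = 120, 180          # vertical travel noted earlier
PAN_ANGLES = range(-90, 91, 15)            # assumed coarse pan sweep in degrees

def quick_seek(arm, camera, display_detector):
    for height in range(ARM_MIN_CM, ARM_MAX_CM + 1, 10):   # coarse vertical steps
        arm.set_height_cm(height)
        for angle in PAN_ANGLES:
            arm.set_pan_deg(angle)
            frame = camera.capture()
            if display_detector.partial_hit(frame):         # any part of a display seen
                return height, angle
    return None                                              # retry with a different sweep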
[029] The autonomous laboratory monitoring robot 214 distinguishes equipment displays from other printed or embossed objects on the instruments by an image-based AI algorithm comprised in the memory 104 and invoked for execution. The autonomous laboratory monitoring robot 214 is trained with thousands of images of equipment displays to identify the difference. Once some part of the equipment display is detected, the autonomous laboratory monitoring robot 214 proceeds to perform a precision seek operation (with an algorithm as known in the art and comprised in the memory and invoked for execution of the method described herein) that estimates the direction in which the telescopic mechanism needs to move for the entire instrument display to be visible, based on the images taken continuously by the camera of the autonomous laboratory monitoring robot 214. The autonomous laboratory monitoring robot 214 can adjust the corresponding telescopic mechanism, i.e., which has height, tilt, and angle adjustment, and move sideways using the mobile base until the entire instrument display is visible. The autonomous laboratory monitoring robot 214 can also interact with instrument displays using the manipulator, i.e., the robotic arm. For example, for some instruments a prompt or an input is required from the operator/scientist after an experiment is completed, in order to view the results of the experiment. The autonomous laboratory monitoring robot 214 can be trained and authorized to interact with the instrument console to display results on the screen for the operator/scientist.
[030] An equipment interface is detected using video processing of the camera feed. If the interface is not detected, the autonomous laboratory monitoring robot 214 adjusts using a corresponding anticipation of where the equipment screen would be, based on familiarity with such machines. If the interface is visible, but not entirely, adjustment is performed based on the position of the visible portion of the screen in the video or photo captured. The adjustments are performed using a mechatronic system which consists of highly accurate stepper motors, screw rods, newly designed and three dimensional (3D) printed assembly parts, proximity and distance sensors, etc. The corrections are made so that the detected interface screen is correctly aligned at the center of the image frame.
[031] The telescopic mechanism automatically adjusts the corresponding height and pan angle until the equipment display is correctly detected in the camera frame. Initially, when the autonomous laboratory monitoring robot 214 reaches a particular station where an instrument is placed, the automatic camera adjustment process starts. Initially, a check is performed to determine whether the display screen of the instrument is detected by the camera. If the display screen is not detected, the telescopic mechanism moves to the corresponding home position from the current position and then moves towards the telescopic top position until the display screen is detected by the camera. When the camera detects the display screen, the telescopic mechanism is stopped. Then the telescopic mechanism fine-tunes the corresponding height and pan angles until the detected display screen is correctly aligned at the center of the image frame.
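For purposes of illustration only, the fine-tuning loop described above may be sketched in Python as a proportional correction of height and pan until the detected display sits at the frame center. The arm, camera, and display_detector objects, the attribute names (center_x, width, etc.), and the gain values are assumptions for the sketch.

# Minimal sketch of fine-tuning arm height and pan to center the detected display.
CENTER_TOL_PX = 20   # assumed pixel tolerance around the frame center

def center_display(arm, camera, display_detector, max_steps=50):
    for _ in range(max_steps):
        frame = camera.capture()
        box = display_detector.detect(frame)        # bounding box of the display, or None
        if box is None:
            arm.step_up()                           # keep rising until a display appears
            continue
        dx = box.center_x - frame.width // 2
        dy = box.center_y - frame.height // 2
        if abs(dx) <= CENTER_TOL_PX and abs(dy) <= CENTER_TOL_PX:
            return True                             # display aligned at the frame center
        arm.adjust_pan_deg(-0.05 * dx)              # small proportional corrections
        arm.adjust_height_cm(0.02 * dy)
    return False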
[032] The image capturing unit 302A-B of the autonomous laboratory monitoring robot 214 obtains one or more images to extract data associated with the one or more equipment. The data associated with the one or more equipment corresponds to: (a) the position, (b) the angle, and (c) the height of the display panel of the one or more equipment. A trained model is generated by, but not limited to, a convolutional neural network (CNN) based on the data associated with the one or more equipment. The model is trained on the laboratory equipment available in the laboratory. For example, consider a deoxyribonucleic acid (DNA) sequencer machine which includes a screen. Various images of the DNA sequencer machine are captured at various angles and heights, and the images are fed through augmentation and transformation for the CNN to create an artificial intelligence (AI) model which can identify at least one of: the position of the screen of the machine, the position of data on the machine, information displayed on the screen, white and dark modes, and various types of alerts; even the audio signal produced by the device is used to train the AI model to identify the information related to the audio data.
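For purposes of illustration only, such a training setup may be sketched in Python using PyTorch and torchvision with a transfer-learning backbone. The dataset path "lab_displays/train", the ResNet-18 backbone, and the hyperparameters are assumptions for the sketch and are not the specific network, data, or parameters of the disclosed model.

# Minimal transfer-learning sketch: augment equipment-display images and fine-tune
# a pretrained backbone to recognize display classes (assumed folder-per-class data).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

train_tf = transforms.Compose([
    transforms.RandomRotation(15),            # mimic varying capture angles
    transforms.RandomResizedCrop(224, scale=(0.7, 1.0)),
    transforms.ColorJitter(brightness=0.3),   # light and dark display modes
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("lab_displays/train", transform=train_tf)  # assumed path
loader = DataLoader(train_ds, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))          # new head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()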
[033] In an embodiment, multiple AI models may be trained on multimodal data, such as images, video, and audio of the equipment available in the laboratory. In an embodiment, based on the current information of the one or more equipment, the autonomous laboratory monitoring robot 214 decides when to check for further information. For example, if the DNA sequencer information indicates that the extraction process has started and the system must wait for another five hours to complete the process, then the autonomous laboratory monitoring robot 214 automatically identifies this and checks back after the predefined period, based on the next stage initiation, followed by listening to the multimodal outcomes from the equipment.
[034] The display panel of the equipment and/or the instrument at the one or more laboratory stations 212A-N is detected from the one or more images captured using the autonomous laboratory monitoring robot 214. A data set of images is created with the instrument images, and annotations for the images are created. For example, 200 images are considered for different instruments. From these images, 70 percent of the input data sets are used for training and the remaining 30 percent are used for validation. In an embodiment, transfer learning is utilized for detecting screens from the input image. The input image includes the entire instrument with background. The screens of the instrument are tilted in the captured images based on the perspective of the actual environment of the laboratory equipment. The algorithm identifies the screen of the equipment in the image, draws a bounding box over it, and crops the content inside the bounding box. In an embodiment, perspective transformation and de-skewing are performed on the captured image according to the type of the equipment. The data is extracted from the captured images. Based on the data to be extracted from the image, i.e., which is either text data or numerical data, specific image-to-string or image-to-data functions are used. The data extracted from the screen is then stored in the database and/or used to trigger necessary actions as required.
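For purposes of illustration only, the crop, perspective correction, and text extraction steps may be sketched in Python using OpenCV and pytesseract. The corner coordinates, output size, and example values are assumptions for the sketch; they are not the specific functions or parameters of the disclosed implementation.

# Minimal sketch: de-skew a detected display screen and read its text with OCR.
import cv2
import numpy as np
import pytesseract

def extract_display_text(image, corners):
    """corners: four (x, y) points of the detected screen, clockwise from top-left."""
    w, h = 640, 480                                        # assumed output size
    src = np.float32(corners)
    dst = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    matrix = cv2.getPerspectiveTransform(src, dst)         # de-skew the tilted screen
    flat = cv2.warpPerspective(image, matrix, (w, h))
    gray = cv2.cvtColor(flat, cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray)               # text/numeric readout

# Example usage with a captured frame and detector output (both assumed):
# text = extract_display_text(frame, [(120, 80), (500, 95), (495, 400), (110, 390)])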
[035] The autonomous laboratory monitoring robot 214 receives a schedule for a scanning operation through the user interface. The user can select a combination of time and stations to initiate the scan during the selected time, and the autonomous laboratory monitoring robot 214 is configured to scan at the selected laboratory stations. The autonomous laboratory monitoring robot 214 can initiate travel to each equipment for: (a) an emergency scan due to an alert sounded by any of the equipment, (b) an experiment that is about to be completed in any of the equipment, and (c) a scientist needing to look at any equipment at any point in time.
[036] The autonomous laboratory monitoring robot 214 is trained with the sounds of each instrument to identify the instrument from which any alarms or alert sounds originate. In case no sound is generated by the instruments, the autonomous laboratory monitoring robot 214 may consider other types of inputs such as a pattern/signal, color codes associated with the instrument, and the like that can enable the autonomous laboratory monitoring robot 214 to identify that specific instrument. Once the instrument is identified by the ‘always active’ monitoring system of the autonomous laboratory monitoring robot 214, i.e., which monitors environmental parameters and audible alerts, the drive system of the autonomous laboratory monitoring robot 214 is enabled, and the robot begins to travel to the instrument causing the sound. If the instrument is not identifiable by the sound, the autonomous laboratory monitoring robot 214 can identify the direction from which the sound originates, using the signals from an inbuilt microphone array around the autonomous laboratory monitoring robot 214, performing triangulation calculations, and using algorithms that identify direction based on relative timing and the Doppler effect.
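For purposes of illustration only, the relative-timing part of the direction estimation may be sketched in Python for a single microphone pair. The microphone spacing, sample rate, and cross-correlation approach are assumptions for the sketch, not the actual array geometry or signal-processing chain of the robot.

# Minimal sketch: estimate the bearing of an alert sound from the time difference of
# arrival (TDOA) between two microphones of the array (assumed geometry).
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING = 0.2        # m between the two microphones (assumed)
SAMPLE_RATE = 16000      # Hz (assumed)

def direction_of_arrival(sig_left, sig_right):
    """Return the bearing of the sound source in degrees relative to the mic axis."""
    corr = np.correlate(sig_left, sig_right, mode="full")
    lag = np.argmax(corr) - (len(sig_right) - 1)           # sample delay between mics
    tdoa = lag / SAMPLE_RATE                               # time difference of arrival
    ratio = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))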
[037] The autonomous laboratory monitoring robot 214 keeps a countdown timer for each instrument and consequently knows when an experiment is about to be completed. When the autonomous laboratory monitoring robot 214 visits any instrument during an experiment, it takes note of the remaining time and keeps track. When the experiment in any instrument is about to be completed, the autonomous laboratory monitoring robot 214 arrives near the instrument a few minutes early, observes the last few minutes, and finally captures the results. In an embodiment, the scientist can ask the autonomous laboratory monitoring robot 214 to go to a particular station at a specific time or immediately upon request.
[038] The autonomous laboratory monitoring robot 214 returns to the one or more docking stations 208A-B once the scan is completed. Apart from the scheduled scan, the user can also select an unscheduled scan. The autonomous laboratory monitoring robot 214 applies intelligence to initiate the unscheduled scan based on the previous scheduled scan, upcoming scheduled scans, and the status of a power supply. For example, the power supply corresponds to a battery. In an embodiment, the power supply may be alternatively referred to as the battery. The autonomous laboratory monitoring robot 214 also allows the user to include the one or more laboratory stations 212A-N of the upcoming scan in the current unscheduled scan. For example, when a request for an unscheduled scan is received, the autonomous laboratory monitoring robot 214 estimates the amount of battery needed to complete the requested scan. The estimation is performed on the basis of the knowledge of the autonomous laboratory monitoring robot 214 regarding the location of each instrument and the distance required to travel. Once this estimation is completed, the autonomous laboratory monitoring robot 214 compares it with the remaining charge in the battery and the time of the next scheduled scan. The autonomous laboratory monitoring robot 214 checks if the battery level is sufficient to complete the requested scan and to charge itself back to the required battery percentage before the next scheduled scan. If any of these conditions cannot be satisfied, an appropriate message is provided to the user. If the current battery level is low, the autonomous laboratory monitoring robot 214 informs the user of the time at which the autonomous laboratory monitoring robot 214 will have enough battery for the requested unscheduled scan. The autonomous laboratory monitoring robot 214 may also be trained with a dataset of patterns including activities and/or scans to be performed and activities/scans that were performed previously, which serve as learning feedback to the autonomous laboratory monitoring robot 214. The feedback may be used by the autonomous laboratory monitoring robot 214 to automatically perform scanning of the environment or monitoring of equipment even in the user’s absence. This intelligence of the autonomous laboratory monitoring robot 214 may avoid any anomalies. Such training of the autonomous laboratory monitoring robot 214 may either be performed by a user feeding a dataset or based on data accumulated by the autonomous laboratory monitoring robot 214 while performing scanning activities. This accumulated data may be either temporarily stored for training the autonomous laboratory monitoring robot 214 or stored in the memory of the autonomous laboratory monitoring robot 214 for further processing and analysis.
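For purposes of illustration only, the battery-feasibility check for an unscheduled scan may be sketched in Python. The energy constants, the charge-rate value, and the per-station attribute distance_from_previous are assumptions for the sketch, not the robot's actual estimation formula.

# Minimal sketch of deciding whether an unscheduled scan can be accepted now.
PERCENT_PER_METER = 0.05      # assumed battery drain per meter of travel
PERCENT_PER_SCAN = 0.5        # assumed drain per instrument scanned
CHARGE_RATE = 0.5             # assumed percent regained per minute while docked

def can_run_unscheduled_scan(stations, current_charge, minutes_to_next_scheduled,
                             charge_needed_for_next):
    travel_m = sum(s.distance_from_previous for s in stations)   # known layout distances
    needed = travel_m * PERCENT_PER_METER + len(stations) * PERCENT_PER_SCAN
    if current_charge < needed:
        wait_min = (needed - current_charge) / CHARGE_RATE
        return False, f"Battery too low; try again in about {wait_min:.0f} minutes."
    # After the requested scan, the robot must recharge in time for the next schedule.
    regain = minutes_to_next_scheduled * CHARGE_RATE
    if current_charge - needed + regain < charge_needed_for_next:
        return False, "Scan would leave too little charge before the next scheduled scan."
    return True, "Unscheduled scan accepted."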
[039] The autonomous laboratory monitoring robot 214 can operate in an environment of one or more docking stations 208A-B. In an embodiment, the one or more docking stations 208A-B correspond to one or more charging stations. The autonomous laboratory monitoring robot 214 can detect the nearest docking station and travel to it when the battery level is between 20-30 percent. In an embodiment, if there is an instrument to scan on the way to the docking station, the autonomous laboratory monitoring robot 214 decides whether there is enough charge to perform a scan on the route to the one or more docking stations 208A-B. Dockers are configured on the front and back sides of the autonomous laboratory monitoring robot 214 to allow docking in either direction. The autonomous laboratory monitoring robot 214 includes an auto docking feature, i.e., after performing an assigned scan (i.e., capturing information from a set of instruments), or if the battery level of the robot becomes critically low, the autonomous laboratory monitoring robot 214 can identify the nearest docking station and navigate to it. The nearest station is identified by the autonomous laboratory monitoring robot 214 based on, but not limited to: (a) the current position, (b) the overall layout of the operational path, (c) the location of each laboratory station, (d) the distance to be traveled to reach each station, and (e) the location of the docking stations.
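For purposes of illustration only, the nearest-dock selection and the optional scan-on-route decision may be sketched in Python. The track object and its methods (path_length, route, stations_on), the scan_pending flag, and the cost constants are assumptions for the sketch, not the disclosed path-planning logic.

# Minimal sketch of choosing the nearest docking station along the guided track,
# with an optional instrument scan on the way if charge permits (assumed inputs).
def choose_docking_station(current_pos, docks, track, battery, scan_cost=2.0):
    """docks: list of dock positions; track.path_length gives distance along the line."""
    best = min(docks, key=lambda d: track.path_length(current_pos, d))
    route = track.route(current_pos, best)
    travel_cost = track.path_length(current_pos, best) * 0.05   # assumed percent/meter
    # If an instrument lies on the route and enough charge remains, scan it en route.
    pending = [s for s in track.stations_on(route) if s.scan_pending]
    do_scan_en_route = bool(pending) and battery > travel_cost + scan_cost
    return best, do_scan_en_route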
[040] Upon detecting the radio frequency range of an approaching docking station, the autonomous laboratory monitoring robot 214 automatically slows down and corrects corresponding orientation with respect to the docking station based on readings from one or more navigation sensors. An array of sound-wave-based and light-wave-based navigational sensors constantly monitors the surroundings at the one or more laboratory stations 212A-N. The inputs from the arrays of sensors are combined to obtain the most accurate information using self-developed sensor fusion techniques. The self-developed sensor fusion technique is based on a graph referred to as a confidence probability graph (CPG). A CPG is maintained on each side of the autonomous laboratory monitoring robot 214, resulting in a total of four CPGs. Each CPG considers the input from all the sensors on a particular side and plots it within the graph. The individual sensors are also assigned a confidence score based on the estimated reliability. The fused value is calculated using a formula based on the confidence score for each sensor and the number of overlapping data points gathered, among various other factors, to obtain accurate data for navigation. After successfully correcting the orientation, the autonomous laboratory monitoring robot 214 slowly moves towards the one or more docking stations 208A-B until the limit switch is activated or the one or more charging contacts 312A-B are detected, when the charging points of the autonomous laboratory monitoring robot 214 meet the docking terminals. Then the autonomous laboratory monitoring robot 214 checks the charging current to validate whether the robot is in charging mode. Once the autonomous laboratory monitoring robot 214 is in charging mode, the scan cycle is completed. In an embodiment, the one or more charging contacts 312A-B on the charging station have a spring-loaded mechanism (not shown in the figures) that makes a gentle push for tight contact even after contact is established. For example, a mechanical limit switch is placed on the autonomous laboratory monitoring robot 214 in such a way that it is pressed once contact is made, and a current sensor placed inside the autonomous laboratory monitoring robot 214 detects current flow from the charger once contact is established.
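For purposes of illustration only, the confidence-weighted fusion of readings from the sensors on one side of the robot may be sketched in Python. The overlap bonus, the agreement threshold, and the example values are assumptions for the sketch, not the actual fusion formula used by the CPG.

# Minimal sketch of confidence-weighted fusion of range readings from one side.
def fuse_side_readings(readings):
    """readings: list of (distance_m, confidence) pairs from the sensors on one side."""
    if not readings:
        return None
    # Readings that agree with at least one other sensor get a small overlap bonus.
    weights = []
    for i, (dist, conf) in enumerate(readings):
        overlaps = sum(1 for j, (d, _) in enumerate(readings)
                       if j != i and abs(d - dist) < 0.05)
        weights.append(conf * (1.0 + 0.2 * overlaps))
    total = sum(weights)
    return sum(d * w for (d, _), w in zip(readings, weights)) / total

# Example: three sensors roughly agree; the outlier is down-weighted by its confidence.
# fused = fuse_side_readings([(1.02, 0.9), (1.00, 0.8), (0.98, 0.85), (1.60, 0.3)])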
[041] FIG. 6A and FIG. 6B are exemplary flow diagrams illustrating a method of monitoring one or more equipment at the one or more laboratory stations 212A-N by the autonomous laboratory monitoring robot 214, according to an embodiment of the present disclosure. In an embodiment, the system 100 comprises one or more data storage devices or the memory 104 operatively coupled to the one or more hardware processors 102 and is configured to store instructions for execution of steps of the method by the one or more hardware processors 102. The flow diagrams depicted are better understood by way of the following explanation/description. The steps of the method of the present disclosure will now be explained with reference to the components of the system as depicted in FIG. 2.
[042] With reference to FIG. 1 - FIG. 3B, at step 602, the autonomous laboratory monitoring robot 214 navigates on the differential drive mobile platform 308 at each section of the one or more laboratory stations 212A-N. With reference to FIG. 4, at step 604, the image capturing unit 302A-B of the autonomous laboratory monitoring robot 214 is aligned by the two degree of freedom (DOF) telescopic arm 306 at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations 212A-N. The two degree of freedom (DOF) telescopic arm 306 traverses vertically, and the image capturing unit 302A-B scans horizontally by rotating about a corresponding axis to detect the display screen of the one or more equipment. At step 606, one or more images of the one or more equipment are obtained/captured by the image capturing unit 302A-B to extract data. The data includes, but is not limited to: (a) a position, (b) a tilt angle, and (c) a height of the display screen of the one or more equipment. At step 608, a trained model is generated by a convolutional neural network based on the data associated with the one or more equipment. At step 610, one or more objects associated with the one or more equipment are detected based on the trained model. The one or more objects pertain to the display screen of the one or more equipment. The display screen of the one or more equipment is detected from the one or more identified images as follows: (a) the one or more identified images are labelled for creating a custom model dataset, (b) a model is trained on the custom model dataset to identify the one or more objects associated with the one or more equipment, and (c) the detected one or more objects are cropped within a bounding box to detect the display screen of the one or more equipment. At step 612, the one or more equipment performing one or more technical activities are monitored by the autonomous laboratory monitoring robot 214 at a predefined time interval in the one or more laboratory stations 212A-N. At step 614, one or more docking stations 208A-B which are located nearer, to which the autonomous laboratory monitoring robot 214 navigates, are dynamically detected after an assigned scan is performed. The docking station which is located nearer is detected based on one of (a) the current position of the autonomous laboratory monitoring robot 214, or (b) an overall layout of the operational path, or (c) a location and a distance to be traveled to reach each docking station, and combination thereof.
[043] In an embodiment, a sound pattern associated with one or more equipment under test, and the direction of the sound pattern, are identified to monitor the one or more technical activities performed at the one or more laboratory stations 212A-N. In an embodiment, the sound pattern pertains to alarms and alert sounds. In an embodiment, the direction from which the sound pattern originates is identified based on one or more signals from a microphone array comprised therein, and triangulation calculations, if the equipment is not identifiable. In an embodiment, if a request for an unscheduled scan is received, then: (a) the amount of power supply needed to complete the requested scan is determined based on the location of each equipment, (b) the distance required to travel is determined, and (c) the remaining charge in the power supply and the time of a next scheduled scan are compared. In an embodiment, the one or more docking stations 208A-B are identified based on a combination of a radio frequency signal and one or more junctions on the differential drive mobile platform 308. In an embodiment, a docking of the autonomous laboratory monitoring robot 214 at the one or more docking stations 208A-B is performed based on navigational sensor data obtained from a confidence probability graph.
[044] The embodiments of the present disclosure herein address the unresolved problem of remote monitoring of laboratory equipment. The embodiments thus provide the autonomous laboratory monitoring robot for remote monitoring of equipment inside the laboratory in a collaborative manner. The autonomous laboratory monitoring robot has a mobile base for guided navigation through the guide track. The autonomous laboratory monitoring robot has a two degree of freedom robot arm with a telescopic mechanism for focusing the camera on the equipment display panel at each laboratory station according to the height and position of the equipment. For example, the autonomous laboratory monitoring robot may be mounted with smartphone cameras (or any other camera) for precise equipment display detection, orientation correction, extraction of information from captured images, and perspective correction of images. The telescopic mechanism can move vertically as well as horizontally. The robot can automatically detect the equipment display, and the position is memorized for future surveillance. Using the convolutional neural network, the equipment display panels are localized for extracting the data. The autonomous laboratory monitoring robot can be automatically or manually scheduled for scanning with a combination of time and equipment using an associated user interface. The autonomous laboratory monitoring robot is capable of detecting junction crossings and dynamically adjusting corresponding speed based on radio frequency sensors placed on the floor.
[045] The autonomous laboratory monitoring robot provides dynamic speed adjustment, automatic docking and charging, radio frequency card-based station detection, environment monitoring, and identification of equipment on the path. The autonomous laboratory monitoring robot is built in a collaborative manner, and there are one or more sensors to provide smooth navigation and safety. For example, ultrasonic sensors and vertical cavity surface emitting laser (VCSEL) sensors are used to detect obstacles around the autonomous laboratory monitoring robot and activate a quick stop to avoid collision. An emergency kill switch is provided to manually activate the emergency stop. Wide beam ultrasonic sensors are used as redundant sensors to complement the VCSEL sensors on different object surfaces. The autonomous laboratory monitoring robot provides a precise docking procedure at each charging station, and the robot can adjust the height and horizontal angle to precisely monitor the equipment placed at each laboratory station. The users can also schedule a scanning cycle through the web interface. The autonomous laboratory monitoring robot can be promptly trained, even if new laboratory equipment is added at the laboratory station, and there is no need to install any additional hardware, camera, or sensor systems. The autonomous laboratory monitoring robot can detect obstacles or people blocking the corresponding path and react safely. At the end of each scan, or if the battery is critically low, the robot can autonomously navigate to the docking station or may be moved via machine/user intervention towards the docking station. The design of the docking station, along with the precise controls, allows a precise docking procedure for the autonomous laboratory monitoring robot. Even slight misorientations of the autonomous laboratory monitoring robot do not prevent a good contact, since the charging contacts in the docking station have the spring-loaded mechanism and there are wide contact pads on the side of the autonomous laboratory monitoring robot. The autonomous laboratory monitoring robot is capable of identifying the one or more environment parameters for optimized functioning of the laboratory equipment. The collaborative feature, i.e., the mirror mechanism, provides a visual aid to the autonomous laboratory monitoring robot when monitoring lab instruments, across scenarios in which the inclination of the laboratory equipment display screens reduces the visibility of the data displayed. The apparatus, when kept over such equipment, helps in more easily distinguishing the data displayed on the screen. A 360 degree adjustable mirror, along with a height adjustable mechanism and a mirror slider, is available for this purpose.
[046] The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
[047] It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g., hardware means like e.g., an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means, and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g., using a plurality of CPUs.
[048] The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
[049] The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
[050] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[051] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.

Claims:
1. An autonomous laboratory monitoring robot (214) for monitoring one or more equipment at one or more laboratory stations (212A-N), wherein the autonomous laboratory monitoring robot (214) comprises:
a memory (104) storing instructions;
one or more communication interfaces (106); and
one or more hardware processors (102) coupled to the memory (104) via the one or more communication interfaces (106), wherein the one or more hardware processors (102) are configured by the instructions to:
navigate, on a differential drive mobile platform (308), the autonomous laboratory monitoring robot (214) at each section of the one or more laboratory stations (212A-N);
align, by a two degree of freedom (DOF) telescopic arm (306), an image capturing unit (302A-B) of the autonomous laboratory monitoring robot (214) at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations (212A-N);
obtain, by the image capturing unit (302A-B), one or more images of the one or more equipment to extract data;
generate, by a convolutional neural network, a trained model based on the data associated with the one or more equipment;
detect, one or more objects associated with the one or more equipment based on the trained model, wherein the one or more objects pertain to the display screen of the one or more equipment;
monitor, by the autonomous laboratory monitoring robot (214), the one or more equipment performing one or more technical activities at a predefined time interval in the one or more laboratory stations (212A-N); and
dynamically detect, one or more docking stations (208A-B) which are located nearer to navigate the autonomous laboratory monitoring robot (214) after an assigned scan is performed, wherein the docking station which is located nearer is detected based on one of (a) a current position of the autonomous laboratory monitoring robot (214), (b) an overall layout of an operational path, and (c) a location and a distance to be traveled to reach each docking station, or a combination thereof.

2. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein the two degree of freedom (DOF) telescopic arm (306) traverses vertically, and the image capturing unit (302A-B) scans horizontally by rotating about a corresponding axis to detect the display screen of the one or more equipment.

3. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein the data pertains to at least one of: (a) a position, (b) a tilt angle, and (c) a height of the display screen of the one or more equipment.

4. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein the one or more hardware processors (102) are further configured by the instructions to detect the display screen of the one or more equipment from the one or more identified images, wherein the detection comprises:
a) label, one or more identified images for creating a custom model dataset;
b) train, a model on the custom model dataset to identify the one or more objects associated with one or more equipment; and
c) crop, the detected one or more objects within a bounding box to detect the display screen of the one or more equipment.

5. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein a sound pattern associated with one or more equipment under test, and direction of the sound pattern are identified to monitor the one or more technical activities performed at the one or more laboratory stations (212A-N), wherein the sound pattern pertains to alarms and alert sounds, and wherein the direction from which the sound pattern originates is identified based on one or more signals from a microphone array comprised therein, and triangulation calculations if the equipment is not identifiable.

6. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein if a request for an unscheduled scan is received, the one or more hardware processors (102) are further configured by the instructions to:
a) determine, amount of power supply needed to complete the requested unscheduled scan based on location of each equipment;
b) determine, distance required to travel; and
c) compare, remaining charge in the power supply and time of a next scheduled scan.

7. The autonomous laboratory monitoring robot (214) as claimed in claim 1, wherein the one or more docking stations (208A-B) are identified based on a combination of a radio frequency signal and one or more junctions on the differential drive mobile platform (308), and wherein a docking of the autonomous laboratory monitoring robot (214) at the one or more docking stations (208A-B) is performed based on navigational sensor data obtained from a confidence probability graph.

8. A processor implemented method (600) for monitoring one or more equipment at one or more laboratory stations (212A-N) by an autonomous laboratory monitoring robot (214), wherein the method comprises one or more steps of:
navigating, via one or more hardware processors, the autonomous laboratory monitoring robot (214) on a differential drive mobile platform (308) at each section of the one or more laboratory stations (212A-N) (602);
aligning, by a two degree of freedom (DOF) telescopic arm (306), an image capturing unit (302A-B) of the autonomous laboratory monitoring robot (214) at a required angle to focus on a display screen of the one or more equipment at the one or more laboratory stations (212A-N) (604);
obtaining, by the image capturing unit (302A-B), one or more images of the one or more equipment to extract data (606);
generating, by a convolutional neural network, a trained model based on the data associated with the one or more equipment (608);
detecting, via the one or more hardware processors, one or more objects associated with the one or more equipment based on the trained model, wherein the one or more objects pertain to the display screen of the one or more equipment (610);
monitoring, by the autonomous laboratory monitoring robot (214), the one or more equipment performing one or more technical activities at a predefined time interval in the one or more laboratory stations (212A-N) (612); and
dynamically detecting, via the one or more hardware processors, one or more docking stations (208A-B) which are located nearer to navigate the autonomous laboratory monitoring robot (214) after an assigned scan is performed, wherein the docking station which is located nearer is detected based on one of (a) a current position of the autonomous laboratory monitoring robot (214), (b) an overall layout of an operational path, and (c) a location and a distance to be traveled to reach each docking station, or a combination thereof (614).

9. The processor implemented method (600) as claimed in claim 8, wherein the two degree of freedom (DOF) telescopic arm (306) traverses vertically, and the image capturing unit (302A-B) scans horizontally by rotating about a corresponding axis to detect the display screen of the one or more equipment.

10. The processor implemented method (600) as claimed in claim 8, wherein the data pertains to at least one of: (a) a position, (b) a tilt angle, and (c) a height of the display screen of the one or more equipment.

11. The processor implemented method (600) as claimed in claim 8, wherein the step of detecting the display screen of the one or more equipment from the one or more identified images comprises:
d) labelling, via the one or more hardware processors, one or more identified images for creating a custom model dataset;
e) training, via the one or more hardware processors, a model on the custom model dataset to identify the one or more objects associated with one or more equipment; and
f) cropping, via the one or more hardware processors, the detected one or more objects within a bounding box to detect the display screen of the one or more equipment.

12. The processor implemented method (600) as claimed in claim 8, wherein a sound pattern associated with one or more equipment under test, and direction of the sound pattern are identified to monitor the one or more technical activities performed at the one or more laboratory stations (212A-N), wherein the sound pattern pertains to alarms and alert sounds, and wherein the direction from which the sound pattern originates is identified based on one or more signals from a microphone array comprised therein, and triangulation calculations if the equipment is not identifiable.

13. The processor implemented method (600) as claimed in claim 8, wherein, if a request for an unscheduled scan is received, the method further comprises:
d) determining, via the one or more hardware processors, amount of power supply needed to complete the requested unscheduled scan based on location of each equipment;
e) determining, via the one or more hardware processors, distance required to travel; and
f) comparing, via the one or more hardware processors, remaining charge in the power supply and time of a next scheduled scan.

14. The processor implemented method (600) as claimed in claim 8, wherein the one or more docking stations (208A-B) are identified based on a combination of a radio frequency signal and one or more junctions on the differential drive mobile platform (308), and wherein a docking of the autonomous laboratory monitoring robot (214) at the one or more docking stations (208A-B) is performed based on navigational sensor data obtained from a confidence probability graph.
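For illustration only, and without limiting the claims, the sound-direction identification recited in claims 5 and 12 may be sketched with a simplified two-microphone time-difference-of-arrival estimate; the sampling rate, microphone spacing, sign convention, and synthetic signals below are assumptions made for the example and do not reflect the actual microphone array or triangulation calculations of the robot.

    # Illustrative sketch: estimating the direction from which an alarm or alert
    # sound originates, from the time difference of arrival between two
    # microphones of an assumed array.
    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s, approximate at room temperature
    MIC_SPACING = 0.10      # metres between the two microphones (assumption)
    SAMPLE_RATE = 48000     # Hz (assumption)

    def estimate_direction(sig_left, sig_right):
        # Cross-correlate the two channels to find the lag (in samples) at which
        # they align, convert it to a time difference, and then to a bearing in
        # degrees (0 = straight ahead; negative values = towards the left mic).
        corr = np.correlate(sig_left, sig_right, mode="full")
        lag = np.argmax(corr) - (len(sig_right) - 1)
        tdoa = lag / SAMPLE_RATE
        ratio = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
        return float(np.degrees(np.arcsin(ratio)))

    # Hypothetical usage: a 2 kHz alarm tone reaching the left microphone a few
    # samples before the right one.
    t = np.arange(0, 0.05, 1.0 / SAMPLE_RATE)
    tone = np.sin(2 * np.pi * 2000 * t)
    delay = 5
    left = np.concatenate([tone, np.zeros(delay)])
    right = np.concatenate([np.zeros(delay), tone])
    print(round(estimate_direction(left, right), 1), "degrees")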

Documents

Application Documents

# Name Date
1 202321042221-STATEMENT OF UNDERTAKING (FORM 3) [23-06-2023(online)].pdf 2023-06-23
2 202321042221-REQUEST FOR EXAMINATION (FORM-18) [23-06-2023(online)].pdf 2023-06-23
3 202321042221-FORM 18 [23-06-2023(online)].pdf 2023-06-23
4 202321042221-FORM 1 [23-06-2023(online)].pdf 2023-06-23
5 202321042221-FIGURE OF ABSTRACT [23-06-2023(online)].pdf 2023-06-23
6 202321042221-DRAWINGS [23-06-2023(online)].pdf 2023-06-23
7 202321042221-DECLARATION OF INVENTORSHIP (FORM 5) [23-06-2023(online)].pdf 2023-06-23
8 202321042221-COMPLETE SPECIFICATION [23-06-2023(online)].pdf 2023-06-23
9 202321042221-FORM-26 [16-08-2023(online)].pdf 2023-08-16
10 202321042221-Proof of Right [12-10-2023(online)].pdf 2023-10-12
11 Abstract.1.jpg 2024-01-04