Abstract: An integrated multi-mode and multi-sensing robotic system and method is disclosed. The fusion of multi-sensory information allows the robotic system to function with enhanced, human-like sensing abilities in structured and unstructured environments. The method comprises acquiring optical data from a robotic system while simultaneously estimating calibration of the optical data, and synchronizing a plurality of detection sensors and a multiple imaging mechanism mounted onto the robot. The method also involves reconstructing the optical data by using a plurality of confidence values of each detection sensor and weighing an input from a decision support module, and mapping data collected cumulatively from acoustic imagery data and thermal data to a 3D optical space. The robotic system is designed to operate in manual and autonomous modes independently and simultaneously. The robotic system is also configured to be a continually evolving, self-learning, and intelligence-gathering system capable of making smart decisions.
CLAIMS:
1. A method for enabling an integrated multi-mode and multi-sensing mechanism in a robot, said method comprising:
acquiring optical data from a robotic system and simultaneously estimating calibration of said optical data;
synchronizing a plurality of detection sensors and at least one multiple imaging mechanism mounted onto said robot, wherein the synchronizing is performed to obtain collated and enhanced resolution images of at least one view of a surrounding;
localizing at least one source of a point of interest for detecting and tracking anomalies by said robot;
reconstructing said optical data by using a plurality of confidence values of each said detection sensor and weighing at least one input from a decision support module;
mapping data collected cumulatively from at least one of an acoustic imagery data and a thermal data to a 3D optical space;
visually representing a superimposition result of a multi-sensory information obtained from the thermal data onto said 3D optical space;
fusing a thermal detection mechanism to detect and localize a plurality of thermal sources by said robot;
fusing a machine olfaction mechanism to allow for automated simulation of the sense of smell by said robot; and
fusing an acoustic mechanism to allow for at least one of acoustic imaging and acoustic mapping by said robot.
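The reconstruction step of claim 1 weighs each detection sensor by its confidence value together with an input from the decision support module. The claim does not fix a formula, so the following Python sketch assumes the simplest reading, a normalized weighted combination of per-sensor estimates; all names (`fuse_estimates`, `decision_weights`) are illustrative, not from the disclosure.

```python
import numpy as np

def fuse_estimates(estimates, confidences, decision_weights):
    """Confidence-weighted fusion of per-sensor estimates.

    estimates        : (n_sensors, dim) readings mapped into a common
                       (e.g. 3D optical) frame
    confidences      : per-sensor confidence values in [0, 1]
    decision_weights : weighting input from the decision support module,
                       one value per sensor
    """
    w = np.asarray(confidences, dtype=float) * np.asarray(decision_weights, dtype=float)
    w = w / w.sum()                               # normalize to a convex combination
    return w @ np.asarray(estimates, dtype=float)  # weighted average estimate

# Example: three sensors voting on the 3D position of a point of interest.
est = [[1.0, 2.0, 0.5], [1.1, 2.1, 0.4], [0.9, 1.9, 0.6]]
print(fuse_estimates(est, confidences=[0.9, 0.7, 0.3],
                     decision_weights=[1.0, 1.0, 0.5]))
```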
2. The method as claimed in claim 1, further comprising creating a perception model and a situational awareness mechanism by using at least one of a sound source localization, a source detection and a mapping of said 3D optical space.
3. The method as claimed in claim 1, wherein said localizing includes at least one of a thermal source localization and the 3D opto-thermal mapping.
4. The method as claimed in claim 1, wherein said fusion of said acoustic mechanism includes fusing of acoustic data acquired by said acoustic imaging and said acoustic thermal mapping.
5. The method as claimed in claim 1, wherein said fusion of said machine olfaction mechanism allows for at least one of a gas detection and a 3D gas distribution mapping.
6. The method as claimed in claim 1, wherein said acoustic mechanism separates a plurality of sound sources and classifies said sound sources into a plurality of sound-related categories.
7. The method as claimed in claim 1, wherein said thermal detection mechanism obtains temperature information using a camera to perform a 3D opto-thermal mapping.
8. The method as claimed in claim 1, wherein said thermal detection mechanism detects a plurality of objects to assess the surface temperature of said objects.
9. The method as claimed in claim 1, wherein said acoustic mechanism prepares at least one acoustically synthesized image of an environment to augment 3D vision and present a perception model.
10. The method as claimed in claim 9, wherein said perception model is presented when vision of said robot is impaired.
11. The method as claimed in claim 1, wherein said acoustic mechanism performs an acoustic thermal mapping procedure when at least one of an active imaging procedure is carried out and a thermal source is localized.
12. The method as claimed in claim 1, wherein said integrated multi-mode and multi-sensing mechanism estimates and tracks at least one of a direction and a distance of at least one of a heat, a sound, a gas, and an odor source.
13. The method as claimed in claim 1, wherein said integrated multi-mode and multi-sensing mechanism processes a plurality of physical inputs to evaluate a plurality of sensory data confidence values.
14. The method as claimed in claim 1, wherein said acoustic mechanism insonifies a location by sweeping a plurality of frequencies and processing the back-scattered data to perform anomaly detection.
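Claim 14 recites insonifying a location with a frequency sweep and processing the back-scattered data for anomaly detection, but does not specify the detector. A minimal sketch, assuming a per-frequency z-score comparison against a reference sweep; the 20-60 kHz band, threshold, and names are hypothetical:

```python
import numpy as np

def detect_anomaly(backscatter, baseline, threshold=3.0):
    """Flag frequencies whose back-scattered power deviates from a
    known-good baseline sweep by more than `threshold` standard deviations.

    backscatter : measured back-scattered power per swept frequency (dB)
    baseline    : mean power per frequency from a reference sweep (dB)
    """
    residual = np.asarray(backscatter) - np.asarray(baseline)
    z = (residual - residual.mean()) / (residual.std() + 1e-9)
    return np.abs(z) > threshold          # boolean anomaly mask per frequency

freqs = np.linspace(20e3, 60e3, 81)       # hypothetical 20-60 kHz sweep
baseline = -40 + 0.1 * np.random.randn(freqs.size)
measured = baseline.copy()
measured[40] += 6.0                       # injected anomaly at one frequency
print(freqs[detect_anomaly(measured, baseline)])
```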
15. The method as claimed in claim 3, wherein said 3D opto-thermal mapping is performed by at least one of an attention model and an experiential sampling.
16. The method as claimed in claim 1, wherein said visually representing of a superimposition result of a multi-sensory information includes superimposing an acoustic imagery data to localize a heat source.
17. The method as claimed in claim 1, wherein said acoustic mechanism localizes a plurality of sound sources by employing at least one of a time-difference of arrival (TDOA) and an angle of arrival (AOA) technique to separate and classify a plurality of sound sources.
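Claim 17 names time-difference of arrival (TDOA) and angle of arrival (AOA) as the localization techniques. For a far-field source and a two-microphone pair of spacing d, the standard relation is c·τ = d·sin θ; the sketch below applies it, using the cross-correlation peak as the TDOA estimate (spacing, delay, and function names are illustrative):

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 deg C

def tdoa_by_cross_correlation(sig_a, sig_b, fs):
    """Estimate the time-difference of arrival between two microphone
    channels from the peak of their cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)     # lag in samples
    return lag / fs                              # lag in seconds

def aoa_from_tdoa(tdoa_s, mic_spacing_m):
    """Far-field angle of arrival from the relation c * tdoa = d * sin(theta)."""
    s = SPEED_OF_SOUND * tdoa_s / mic_spacing_m
    return np.arcsin(np.clip(s, -1.0, 1.0))     # radians

# Example: ~0.29 ms of delay across a 20 cm pair -> roughly 30 degrees.
print(np.degrees(aoa_from_tdoa(2.9e-4, 0.20)))
```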
18. The method as claimed in claim 1, wherein said integrated multi-mode and multi-sensing mechanism enables said robot to detect and track each said point of interest by:
a) navigating around said point of interest in a defined path to capture at least one of optical and thermal images;
b) reconstructing, in a 3D manner, a plurality of optical images obtained from at least one 2D image, wherein said reconstruction is calibrated by an odometer and an inertial measurement unit;
c) mapping between at least one of an optical and a thermal sensor by using a 3D to 2D back-projection technique, wherein a color value obtained from the thermal sensor is mapped onto said 3D optical space to establish 3D opto-thermal mapping;
d) coupling the acoustic imaging with said 3D opto-thermal mapping to obtain an acoustic thermal mapping; and
e) using 3D gas distribution mapping to create a spatial representation of a gas distribution using a plurality of gas sensors equipped in said robot.
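Step (c) of claim 18 maps color values from the thermal sensor onto the 3D optical space by 3D to 2D back-projection. A minimal pinhole-camera sketch of that mapping, assuming known thermal-camera intrinsics `K` and extrinsics `R`, `t`; the names and interfaces are hypothetical:

```python
import numpy as np

def backproject_thermal(points_3d, K, R, t, thermal_image):
    """Project 3D points from the optical reconstruction into the thermal
    camera and pull a per-point temperature/color value.

    points_3d     : (N, 3) points in the optical (world) frame,
                    assumed to lie in front of the thermal camera
    K, R, t       : thermal-camera intrinsics and extrinsics
    thermal_image : (H, W) temperature map from the thermal sensor
    """
    points_3d = np.asarray(points_3d, dtype=float)
    cam = R @ points_3d.T + t.reshape(3, 1)         # world -> camera frame
    uv = K @ cam
    uv = (uv[:2] / uv[2]).T                         # perspective divide -> pixels
    h, w = thermal_image.shape
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, w - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, h - 1)
    return thermal_image[v, u]                      # thermal value per 3D point
```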
19. The method as claimed in claim 1, wherein said integrated multi-mode and multi-sensing mechanism uses a multi-stage decision support mechanism for:
a) choosing a plurality of sensors for each particular purpose of detecting and tracking and deciding a data weightage of said plurality of sensors;
b) implementing a man-in-loop control override for each said particular purpose of detecting and tracking, wherein said robot hands over control to the man-in-loop control module when the robot is unable to perform said particular purpose of detecting and tracking;
c) switching between a manual and an autonomous mode; and
d) simultaneously performing in the manual and the autonomous mode.
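Claim 19's decision support mechanism stages sensor selection, a man-in-the-loop override on failure, and manual/autonomous mode handling. A toy Python state machine illustrating that control flow; the class, methods, and threshold are invented for the sketch:

```python
from enum import Enum, auto

class Mode(Enum):
    MANUAL = auto()
    AUTONOMOUS = auto()

class DecisionSupport:
    """Toy multi-stage decision support: select sensors and their data
    weightage, then fall back to the human operator on failure."""

    def __init__(self, sensor_weights):
        self.sensor_weights = sensor_weights   # e.g. {"thermal": 0.6, "acoustic": 0.3}
        self.mode = Mode.AUTONOMOUS

    def select_sensors(self, min_weight=0.1):
        # Stage (a): keep only sensors whose weightage passes a threshold.
        return {s: w for s, w in self.sensor_weights.items() if w >= min_weight}

    def update(self, task_succeeded):
        # Stage (b): man-in-loop override when the task cannot be performed.
        if not task_succeeded:
            self.mode = Mode.MANUAL
        return self.mode

    def switch(self, mode):
        # Stages (c)/(d): explicit mode switching; claim 19(d) also allows
        # the manual and autonomous modes to run simultaneously.
        self.mode = mode
```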
20. The method as claimed in claim 1, wherein said thermal detection mechanism is used to detect a plurality of thermal sources and receive a plurality of thermal signal power levels.
21. An integrated multi-mode and multi-sensing robotic system configured for detecting, tracking, and surveillance of at least one point of interest, said system comprising:
an optical detection module configured to acquire optical data and simultaneously estimate calibration of said optical data, wherein said optical detection module is further configured to estimate distance of the point of interest and obtain a real time depth map of a surrounding environment;
an autonomous robotic navigation module configured to track and monitor at least one said point of interest and navigate around said point of interest;
a sensor fusion module configured to create a perception model and develop a situational awareness map around said point of interest;
a reconstruction module configured to reconstruct said optical data by using at least one of a plurality of confidence values and said perception model;
a thermal detection module configured to estimate temperature of the point of interest and perform a 3D opto-thermal mapping by using at least one of an attention model and an experiential sampling technique;
a machine olfaction module configured to generate a spatial representation of a gas distribution of said point of interest by 3D gas distribution mapping; and
an acoustic detection module configured to enable at least one of an acoustic imaging and an acoustic mapping of said point of interest, wherein said acoustic imaging uses a sound wave transmitted by an active transducer and a back scatter wave received by a plurality of transducers.
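Claim 21 recites the system as cooperating modules. One illustrative way to wire them, with the `sense`/`fuse` interfaces invented purely for this sketch:

```python
class RoboticSystem:
    """Illustrative wiring of the modules recited in claim 21; the
    sense/fuse interfaces are not taken from the filing."""

    def __init__(self, fusion_module, **modules):
        self.fusion = fusion_module
        self.modules = modules   # optical, thermal, olfaction, acoustic, navigation

    def survey(self, point_of_interest):
        # Each modality observes the point of interest independently; the
        # sensor fusion module merges the readings into one perception model.
        readings = {name: m.sense(point_of_interest)
                    for name, m in self.modules.items()}
        return self.fusion.fuse(readings)
```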
22. The robotic system as claimed in claim 21, wherein said robotic system is configured to use a plurality of sensors, wherein the plurality of sensors are at least one of a microphone array, a thermopile, a thermal camera, an ultrasonic transducer, and a gas sensor.
23. The robotic system as claimed in claim 21, wherein said robotic system is configured to work in a surrounding environment, wherein said environment is at least one of an open, a dark, a smoky, and a foggy environment.
24. The robotic system as claimed in claim 21, wherein said acoustic detection module is configured to use a microphone array to localize at least one sound source and separate and classify said sound source into a plurality of pre-determined categories.
25. The robotic system as claimed in claim 21, wherein said thermal detection module is configured to use a thermopile, said thermopile adapted to detect and localize at least one thermal source.
26. The robotic system as claimed in claim 21, wherein said acoustic detection module is configured to prepare an acoustically synthesized image of a surrounding environment and provide a perception model when vision of the robotic system is impaired.
27. The robotic system as claimed in claim 21, wherein said autonomous robotic navigation module is configured to estimate a plurality of parameters, wherein the parameters are at least one of a distance and a direction of a heat, a sound, and an odor source.
28. The robotic system as claimed in claim 21, wherein said sensor fusion module is further configured to implement a multi-modal sensor fusion mechanism to process a plurality of physical inputs and evaluate a plurality of data confidence values.
29. The robotic system as claimed in claim 21, wherein said thermal detection module engages a plurality of infra-red thermal detectors to implement a mono-pulse direction finding mechanism.
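Claim 29's mono-pulse direction finding with infra-red detectors follows the classic amplitude-comparison scheme: near boresight, the normalized difference of two overlapping detectors' received powers is roughly proportional to the source's angular offset. A sketch, where the calibration slope is an assumed constant:

```python
def monopulse_bearing_error(power_a, power_b, slope_deg=10.0):
    """Amplitude-comparison mono-pulse with two overlapping detector beams:
    the normalized power difference maps (roughly linearly, near boresight)
    to the angular offset of the thermal source.

    slope_deg is a detector-specific calibration constant (assumed here).
    """
    return slope_deg * (power_a - power_b) / (power_a + power_b)

# Source slightly to the 'A' side of boresight -> about +2 degrees.
print(monopulse_bearing_error(1.2, 0.8))
```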
30. The robotic system as claimed in claim 21, wherein said thermal detection module implements a thermal source localization mechanism and engages a 3D opto-thermal mapping module that applies at least one of an attention model and an experiential sampling technique.
31. The robotic system as claimed in claim 21, wherein said acoustic detection module is configured to insonify a location by performing a frequency sweep and processing a back-scattered data to perform an anomaly detection.
32. The robotic system as claimed in claim 21, wherein said thermal detection module is further configured to:
a) enable a 3D reconstruction of a plurality of optical images of the point of interest from a plurality of 2D images, wherein said 3D reconstruction is calibrated by an odometer and an inertial measurement unit;
b) enable a 3D to 2D back-projection technique to perform mapping between at least one of an optical and a thermal sensor; and
c) map a plurality of color values of a thermal imagery.
33. The robotic system as claimed in claim 21, wherein said machine olfaction module is configured to estimate a position of the point of interest by implementing a visual odometry mechanism and a range finder to represent the source of the gas in a 3D map.
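Claims 18(e), 33, and 38 describe building a 3D gas distribution map from sensor readings localized by odometry and range finding. A minimal kernel-weighted sketch in the spirit of kernel-DM-style gas mapping; the kernel width and interfaces are illustrative:

```python
import numpy as np

def kernel_gas_map(samples, grid, sigma=0.5):
    """Kernel-weighted 3D gas-distribution map.

    samples : iterable of (position (3,), concentration) pairs from the
              robot's gas sensors, localized in the map frame
    grid    : (M, 3) query positions of the map cells
    sigma   : Gaussian kernel width in map units (assumed value)
    """
    grid = np.asarray(grid, dtype=float)
    num = np.zeros(len(grid))
    den = np.zeros(len(grid))
    for pos, conc in samples:
        d2 = np.sum((grid - np.asarray(pos, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian kernel weight
        num += w * conc
        den += w
    return num / np.maximum(den, 1e-9)          # concentration estimate per cell
```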
34. The robotic system as claimed in claim 21, wherein said system further comprises a multi-stage decision support module, said decision support module further configured to:
a) select at least one decision sensor and a data weightage of the at least one decision sensor;
b) override control of the decision sensor and engage in a manual mode;
c) hand-off control between the manual mode and an autonomous mode, wherein said manual mode is controlled by a man-in-loop control module; and
d) function simultaneously in the manual mode and the autonomous mode.
35. The robotic system as claimed in claim 21, wherein said optical detection module is further configured to synchronize a plurality of captured images and a plurality of optical detection sensors to implement 3D reconstruction.
36. The robotic system as claimed in claim 21, wherein said optical detection module implements 3D reconstruction of the optical data by using at least one confidence value from amongst said plurality of confidence values, an optical detection sensor and a weighing input from a two-stage decision support module.
37. The robotic system as claimed in claim 21, wherein said optical detection module is configured to visually represent a superimposition result of a multi-sensory information onto a 3D optical space.
38. The robotic system as claimed in claim 21, wherein said machine olfaction module comprises a plurality of gas sensors adapted to determine at least one of a source of gas and an area of high concentration of the gas.
39. The robotic system as claimed in claim 21, wherein said robotic system is configured to function, co-operate, and co-ordinate with at least one second robotic system.
40. The robotic system as claimed in claim 39, wherein said robotic system is configured to alert the second robotic system.
SPECIFICATION: As attached
| # | Name | Date |
|---|---|---|
| 1 | 498-MUM-2015-POWER OF AUTHORITY-(21-04-2015).pdf | 2015-04-21 |
| 2 | 498-MUM-2015-CORRESPONDENCE-(21-04-2015).pdf | 2015-04-21 |
| 3 | RELEVANT DOCUMENT.pdf | 2018-08-11 |
| 4 | PD015497IN-SC - SPEC FOR FILING.pdf ONLINE | 2018-08-11 |
| 5 | PD015497IN-SC - SPEC FOR FILING.pdf | 2018-08-11 |
| 6 | PD015497IN-SC - FORM 5.pdf ONLINE | 2018-08-11 |
| 7 | PD015497IN-SC - FORM 5.pdf | 2018-08-11 |
| 8 | PD015497IN-SC - FORM 3.pdf ONLINE | 2018-08-11 |
| 9 | PD015497IN-SC - FORM 3.pdf | 2018-08-11 |
| 10 | PD015497IN-SC - DRAWIINGS FOR FILIING.pdf ONLINE | 2018-08-11 |
| 11 | PD015497IN-SC - DRAWIINGS FOR FILIING.pdf | 2018-08-11 |
| 12 | FORM 13.pdf | 2018-08-11 |
| 13 | 498-MUM-2015-Form 1-050815.pdf | 2018-08-11 |
| 14 | 498-MUM-2015-Correspondence-050815.pdf | 2018-08-11 |
| 15 | 498-MUM-2015-FER.pdf | 2019-06-10 |
| 16 | 498-MUM-2015-OTHERS [10-12-2019(online)].pdf | 2019-12-10 |
| 17 | 498-MUM-2015-FER_SER_REPLY [10-12-2019(online)].pdf | 2019-12-10 |
| 18 | 498-MUM-2015-DRAWING [10-12-2019(online)].pdf | 2019-12-10 |
| 19 | 498-MUM-2015-ABSTRACT [10-12-2019(online)].pdf | 2019-12-10 |
| 20 | 498-MUM-2015-PatentCertificate24-11-2023.pdf | 2023-11-24 |
| 21 | 498-MUM-2015-IntimationOfGrant24-11-2023.pdf | 2023-11-24 |
| 1 | tpo_07-06-2019.pdf | |