
A System and Method for Determining 2D Localization of a Target in Confined Spaces

Abstract: The present invention relates to a system (100) for determining the 2D localization of a target in confined spaces. In one embodiment, the system comprises: at least one image acquisition device (110) configured to capture an image of an area of interest covering the target on a plane surface; at least one pose control device (120) configured to measure and adjust a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; and at least one processing device (130) in communication with the image acquisition device (110) and the pose control device (120), configured to receive the captured image of the target from the image acquisition device, and the measured position and orientation of the image acquisition device from the pose control device, to determine the 2D localization of the target. Figure 2 (for publication)


Patent Information

Application #: 202441043379
Filing Date: 04 June 2024
Publication Number: 27/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:

Applicants

Planys Technologies Pvt. Ltd.
No. 5 Jaya Nagar Extension, Balaji Nagar Main Road, G.K. Avenue, Puzhuthivakkam, Chennai 600091, Tamil Nadu, India

Inventors

1. Vishnu Venkatesh
5 Jaya Nagar extension, Balaji Nagar Main Rd, G.K. Avenue, Puzhuthivakkam, Chennai 600091, Tamil Nadu, India
2. Ashish Antony Jacob
5 Jaya Nagar extension, Balaji Nagar Main Rd, G.K. Avenue, Puzhuthivakkam, Chennai 600091, Tamil Nadu, India
3. Vineet Upadhyay
5 Jaya Nagar extension, Balaji Nagar Main Rd, G.K. Avenue, Puzhuthivakkam, Chennai 600091, Tamil Nadu, India

Specification

Description:
FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10, rule 13)

“A SYSTEM AND METHOD FOR DETERMINING 2D LOCALIZATION OF A TARGET IN CONFINED SPACES”

By
Planys Technologies Private Limited, an Indian company, of No. 5 Jaya Nagar Extension, Balaji Nagar Main Road, G.K. Avenue, Puzhuthivakkam, Chennai 600091, Tamil Nadu, India

The following specification particularly describes the invention and the manner in which it is to be performed.

FIELD OF THE INVENTION
[0001] The present disclosure/invention relates in general to a method and system for extracting the 2D position of a target, and more particularly to a vision-based method and system for determining 2D localization in confined spaces in the absence of fiducial markers.
BACKGROUND OF THE INVENTION
[0002] It is sometimes necessary to inspect spaces that are not safely accessible to humans. In these environments, robots can be employed to carry out inspections, but their motion needs to be tracked in order to identify regions of interest. Some of the locations that need to be inspected are inside large storage tanks, which immediately restricts the types of positioning systems that can be used. These environments also cannot be accessed beforehand to set up equipment to aid the inspection, which further limits the available choices.
[0003] Many applications today require accurate positioning in enclosed spaces. There are many available technologies for tracking and localizing moving targets, but many of them can fail, or are difficult to deploy, in certain environments. Therefore, there is a need for a positioning system that can work in enclosed indoor spaces without prior preparation.
[0004] Some of the typical positioning systems are provided below.
[0005] GNSS (Global Navigation Satellite System) comprises a network of satellites broadcasting timing and orbital information used for navigation and positioning measurements. GNSS uses messages received from satellite constellations for navigation and timing. In general, GNSS receivers access one or more of the available constellations, such as GPS, Galileo, GLONASS or BeiDou, to retrieve position (latitude, longitude, altitude), navigation (heading, bearing) and timing (current timestamp) information. GNSS positioning is extremely accurate, but unreliable in indoor settings.
[0006] Acoustic/SONAR systems use time-of-arrival and direction-of-arrival information of acoustic signals to determine the location of a target. Such systems can send out their own acoustic signals and listen for echoes (active), or they can be used in a pinger/microphone configuration (passive). However, in a reflective environment, there may be ambiguity when processing received signals, leading to uncertainty in position.
[0007] An inertial positioning system comprises inertial measurement units (IMUs) that provide information from accelerometers (linear acceleration) and gyroscopes (angular rate), from which position can be calculated by numerical integration. While this technique is independent of the environment, its accuracy depends heavily on the quality of the sensors used and the post-processing algorithms.
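As a rough, hypothetical illustration of the double numerical integration referred to above (not part of the specification), the following Python sketch shows how an IMU-only position estimate is formed, and why any bias in the samples accumulates into drift:

```python
import numpy as np

def dead_reckon(world_accels, dt, v0=np.zeros(3), p0=np.zeros(3)):
    """Hypothetical dead-reckoning sketch: integrate world-frame
    accelerations (m/s^2) twice at a fixed time step dt (s).
    A constant bias in the samples grows quadratically in position,
    which is why IMU-only positioning drifts without correction."""
    velocity, position = v0.astype(float).copy(), p0.astype(float).copy()
    track = [position.copy()]
    for a in world_accels:
        velocity += np.asarray(a, dtype=float) * dt  # first integration: velocity
        position += velocity * dt                    # second integration: position
        track.append(position.copy())
    return np.array(track)
```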
[0008] RADAR/LIDAR systems use the time-of-flight of electromagnetic signals to determine distance. These are typically active systems with a transmitter and a receiver. LIDAR systems are typically used for mapping, while RADAR systems are used for distance and velocity estimation. These systems can be used for localization, but they are better suited for obstacle detection and are very expensive.
[0009] Direct field measurement uses the earth's gravitational and magnetic fields to determine orientation. Gyrocompasses use the earth's rotation as a reference to determine the orientation of a body, and magnetometers can be used like a compass to provide a north reference. However, the earth's magnetic field can be distorted inside enclosed spaces, making this approach unreliable there.
[0010] Visual positioning uses markers on the target to determine position and orientation. Markers often feature materials or patterns that are easy to discern from the environment, and they provide a size reference that can be used to determine a mapping between pixel coordinates and physical distances. Obtaining a position estimate through a vision-based system would typically require the use of fiducial markers (reference markers used in imaging or measurement systems to establish a frame of reference); however, in some environments it is not feasible to place such markers ahead of time.
[0011] Positioning systems that require prior placement of markers or transceivers cannot be used, as access to the space may be restricted or hazardous. Indoor environments can have walls and other obstructions that degrade signals and affect systems relying on time and angle of arrival for positioning. Finally, access to these locations may be limited, restricting the deployment of equipment. These conditions severely limit the types of positioning systems that can be deployed, even before factoring in cost, complexity, and accuracy.
[0012] Therefore, there is a need in the art for a method and system for determining 2D localization in confined spaces in the absence of fiducial markers, which also addresses the above-mentioned limitations.
OBJECTIVE OF THE INVENTION
[0013] The main objective of the present invention is to provide a system and method for determining 2D localization of a target in confined spaces in the absence of fiducial markers.
SUMMARY OF THE INVENTION

[0014] An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below.
[0015] Accordingly, one aspect of the present invention relates to a system (100) for determining the 2D localization of a target in confined spaces. The system comprises: at least one image acquisition device (110) configured to capture an image of an area of interest covering the target on a plane surface; at least one pose control device (120) configured to measure and adjust a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; and at least one processing device (130) in communication with the image acquisition device (110) and the pose control device (120), configured to receive the captured image of the target from the image acquisition device, and the measured position and orientation of the image acquisition device from the pose control device, to determine the 2D localization of the target.
[0016] Another aspect of the present invention relates to a method (500) for determining the 2D localization of a target in confined spaces. The method comprises: capturing (510), by at least one image acquisition device, an image of an area of interest covering the target on a plane surface; measuring and adjusting (520), by at least one pose control device, a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; receiving (530), by at least one processing device, the captured image of the target from the image acquisition device and the measured position and orientation of the image acquisition device from the pose control device; and determining (540), by the at least one processing device, the 2D localization of the target based on the received image and the measured position and orientation of the image acquisition device.
[0017] Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0018] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0019] Figure 1 shows a simplified block diagram of the system for determining 2D localization of a target in confined spaces according to an exemplary implementation of the present disclosure/invention.
[0020] Figure 2 shows a simplified schematic of the system for determining 2D localization of a target in confined spaces according to an exemplary implementation of the present disclosure/invention.
[0021] Figure 3 shows a representation of a captured image after object detection and pixel coordinate extraction according to an exemplary implementation of the present disclosure/invention.
[0022] Figure 4 shows a deployment of the system shown in figures 1 and 2 according to an exemplary implementation of the present disclosure/invention.
[0023] Figure 5 shows a method for determining 2D localization of a target in confined spaces according to an exemplary implementation of the present disclosure/invention.
[0024] Figure 6 shows an example deployment of the system with the target moved to different locations in the region of interest according to an exemplary implementation of the present disclosure/invention.
[0025] Figure 7 shows a process flowchart for an embodiment of the invention according to an exemplary implementation of the present disclosure/invention.
[0026] Figure 8 shows the geometric relationship used to transform image space to physical space according to an exemplary implementation of the present disclosure/invention.
[0027] Figure 9 shows an example working of the system according to an exemplary implementation of the present disclosure/invention.
[0028] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative methods embodying the principles of the present disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION OF THE INVENTION

[0029] The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
[0030] The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention are provided for illustration purpose only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
[0031] It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
[0032] By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
[0033] Figures discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way that would limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged communications system. The terms used to describe various embodiments are exemplary. It should be understood that these are provided to merely aid the understanding of the description, and that their use and definitions in no way limit the scope of the invention. Terms first, second, and the like are used to differentiate between objects having the same terminology and are in no way intended to represent a chronological order, unless where explicitly stated otherwise. A set is defined as a non-empty set including at least one element.
[0034] In the following description, for purpose of explanation, specific details are set forth in order to provide an understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure may be practiced without these details. One skilled in the art will recognize that embodiments of the present disclosure, some of which are described below, may be incorporated into a number of systems.
[0035] However, the systems and methods are not limited to the specific embodiments described herein. Further, structures and devices shown in the figures are illustrative of exemplary embodiments of the present disclosure and are meant to avoid obscuring the present disclosure.
[0036] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
[0037] Various embodiments of the present invention are further described with reference to FIGS. 1 to FIGS. 9.
[0038] In view of the challenges described above, an invention is proposed whose purpose is to provide reliable and accurate positions, with ease of deployment and minimal human intervention.
[0039] Figure 1 illustrates a simplified block diagram representation of the system (100), in which at least some embodiments of the present disclosure can be implemented. The system (100) comprises one or more devices configured to determine the 2D localization of a target in confined spaces. Although the system (100) is depicted to include one or more devices, components or modules arranged in a particular arrangement in the present disclosure, it should not be taken to limit the scope of the present disclosure.
[0040] To achieve this purpose, the invention provides a positioning and tracking system which comprises an image acquisition device (referred to hereafter as the imaging device), a mounting device to secure the image acquisition device (henceforth, the mount), a target device to be tracked by the image acquisition device (henceforth, the target), a pose control device for the image acquisition device (henceforth, the pose control system), and a processing device (henceforth, the computer).
[0041] In one embodiment of the present invention, the system (100) for determining 2D localization of a target in confined spaces comprises: at least one image acquisition device (110) configured to capture an image of an area of interest covering the target on a plane surface; at least one pose control device (120) configured to measure and adjust the position and orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; and at least one processing device (130) in communication with the image acquisition device (110) and the pose control device (120), configured to receive the captured image of the target from the image acquisition device, and the measured position and orientation of the image acquisition device from the pose control device, to determine the 2D localization of the target.
[0042] The image acquisition device/imaging device is first mounted at a secure location with the area of interest in its field of view. To estimate the position of the target, the imaging device captures an image of the area of interest and sends it to the processing device along with data from the pose control device/system. If the target is not in view of the imaging device, the imaging device's pose may be adjusted by the pose control device. The image is then processed on the processing device to extract position data, which is communicated to the operator.
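For concreteness, one possible shape for the data that travels from the imaging and pose control devices to the processing device is sketched below in Python. The specification does not fix a message format; all type and field names here are illustrative assumptions:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PoseSample:
    """Pose of the imaging device relative to the plane surface
    (illustrative fields; the specification does not fix a format)."""
    height_m: float   # perpendicular distance from camera to the plane ('h' in figure 8)
    pitch_rad: float  # angle of the optical axis from the vertical through the camera
    yaw_rad: float    # rotation of the camera about the vertical axis

@dataclass
class Frame:
    """An image paired with the pose data captured alongside it."""
    image: np.ndarray  # captured image of the area of interest
    pose: PoseSample
```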
[0043] The image acquisition device (110) is a camera with one or more integrated sensors, configured to monitor and capture images, in different spectra, of the target as it moves on the plane surface.
[0044] The pose control device (120) is a pose control system comprising at least one of inclinometers, accelerometers, gyroscopes, non-contact distance sensors and inertial measurement units to measure the position and the orientation of the image acquisition device. The pose control device further comprises at least one of mechanical joints, integrated motors and actuators to adjust the position and the orientation of the image acquisition device relative to the plane surface.
[0045] The processing device (130) is a computing device (computer) configured to receive the captured image of the target from the image acquisition device, and the measured position and orientation of the image acquisition device from the pose control device, to determine the 2D localization of the target. The processing device (130) determines the 2D localization information of the target on the plane surface based on the position and orientation of the image acquisition device relative to the plane surface and the pixel location of the target in the image.
[0046] The computing device may include one or more components/operating modules for determining the 2D localization information of the target. In an example, the computing device (130) may be a smartphone, laptop computer, personal computer system, server computer system, thin client, thick client, handheld device, microprocessor-based system, networked personal computer, minicomputer system, mainframe computer system, or a distributed cloud computing environment including any of the above, and the like. Although the computing device is depicted to include one or a few components, modules, or devices arranged in a particular arrangement in the present disclosure, this should not be taken to limit the scope of the present disclosure. Further, two or more components may be embodied in one single component, and/or one component may be configured using multiple sub-components to perform the desired functionalities. Some components of the processing device (130) may be configured using hardware elements, software elements, firmware elements, and/or a combination thereof.
[0047] Further, the processing device (130) is capable of executing the machine executable instructions to perform the functions described above. In an embodiment, the processing device (130) may be implemented as a multi-core processor, a single core processor, or a combination of one or more multi-core processors and one or more single core processors. For example, the processing device (130) may be embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), or the like.
[0048] Figure 2 shows a simplified schematic of the system for determining 2D localization of a target in confined spaces according to an exemplary implementation of the present disclosure/ invention.
[0049] The figure shows a simplified schematic of the system used for determining 2D localization of a target in confined spaces. The system comprises at least one image acquisition device (110), at least one pose control device (120), at least one processing device (130) and at least one mounting device (140).
[0050] The system comprises at least one image acquisition device (110) with integrated sensors, a target to track, and a processing device/computer (130) to process the data. The pose control device (120) measures the current pose of the image acquisition device and its distance from the plane. The image acquisition device tracks the target, which moves on the plane. The image of the target from the image acquisition device and the pose data of the image acquisition device are sent to the processing device for post-processing, and the extracted position of the target is sent to the operator.
[0051] The mounting device (140) of the system is configured to hold the pose control device (120) and the image acquisition device (110). In the present invention, the target can be a stationary or non-stationary target, varying in size, shape, appearance and intrinsic abilities. The actuators of the pose control device control the distance between the image acquisition device and the area of interest, and the integrated motors control angular movements of the image acquisition device.
[0052] In the present invention, the pose control device is configured to keep the target in view of the image acquisition device by changing the pose of the image acquisition device based on the movement of the target. The processing device is configured to determine the 2D localization information of the target on the plane surface based on the position and orientation of the image acquisition device relative to the plane surface and the pixel location of the target in the image. The essence of the present system is to use a camera with integrated sensors to extract the 2D position of a target in a confined space in the absence of fiducial markers.
[0053] The present invention's approach to positioning avoids the complexities posed by navigating inaccessible, enclosed spaces. The system can be used to visually track a target in enclosed spaces where conventional positioning methods fail.
[0054] As a passive system, the present system is less complex than conventional positioning systems, which rely on communication between disparate components via acoustic or electromagnetic signals. The system does not require the placement of fiducial markers prior to deployment.
[0055] The most basic deployment for this system is positioning on flat, horizontal surfaces, such as the surface of water. However, this method and system can also be used on other surfaces as long as a model of that surface is available beforehand. In some cases, the pose control system can also be used to determine the contour of the plane of interest.
[0056] Figure 3 shows a representation of a captured image after object detection and pixel coordinate extraction according to an exemplary implementation of the present disclosure/ invention.
[0057] The figure shows the representation of a captured image after object detection and pixel coordinate extraction, with an example of target detection on a plane. A bounding box is generated, and a position (x, y) is extracted. The process that detects the target depends on the specific embodiment of the target. For example, a target of significantly different colour from its surroundings may be identified by pixel segmentation, while in other embodiments, targets with distinct features can be tracked by object detection models such as YOLO. In case automatic systems fail to detect the target, the operator can also manually select the location of the target. In each scenario, a pixel location on the image is returned.
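A minimal sketch of the pixel-segmentation route described above, assuming OpenCV and an HSV colour threshold tuned to the target; the threshold values here are placeholders, not taken from the specification:

```python
import cv2
import numpy as np

def detect_target_pixel(image_bgr, lo=(0, 120, 120), hi=(10, 255, 255)):
    """Segment a distinctly coloured target by HSV thresholding and return
    the pixel centroid of the segmented region, or None if nothing is found
    (in which case an operator could select the target manually)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lo, np.uint8), np.array(hi, np.uint8))
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) in pixels
```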
[0058] Figure 4 shows a deployment of the system as shown in figures 1 and 2 according to an exemplary implementation of the present disclosure/ invention.
[0059] The figure shows an alternative view of the embodiment visualized in figures 1 and 2, representing the deployment from another angle. The target is placed on the plane's surface and is capable of movement, either autonomous or remote-controlled. The system estimates the position of the target and makes it available to the operator or to other systems. One possible application is mapping the movement of the target in a given space; another is using the position to make decisions on future movement of the target.
[0060] Figure 5 shows a method for determining 2D localization of a target in confined spaces according to an exemplary implementation of the present disclosure/ invention.
[0061] The figure shows the method for determining 2D localization of a target in confined spaces. The method (500) comprises: capturing (510), by at least one image acquisition device, an image of an area of interest covering the target on a plane surface; measuring and adjusting (520), by at least one pose control device, a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; receiving (530), by at least one processing device, the captured image of the target from the image acquisition device and the measured position and orientation of the image acquisition device from the pose control device; and determining (540), by the at least one processing device, the 2D localization of the target based on the received image and the measured position and orientation of the image acquisition device.
[0062] The method further comprises calibrating, prior to deployment, the image acquisition device, the pose control device and the processing device to determine the intrinsic parameters of the devices.
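The specification does not prescribe a calibration procedure; a common choice for determining a camera's intrinsic parameters is OpenCV's chessboard calibration, sketched below under that assumption (the pattern size and square length are placeholders):

```python
import cv2
import numpy as np

def calibrate_intrinsics(images, pattern=(9, 6), square_m=0.025):
    """Estimate the camera matrix and distortion coefficients from a set of
    chessboard photographs using OpenCV's standard routine."""
    # 3D coordinates of the chessboard corners in the board's own frame.
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_m
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        size = gray.shape[::-1]                       # (width, height)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    _, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
    return K, dist  # 3x3 intrinsic matrix and lens distortion coefficients
```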
[0063] Figure 6 shows an example deployment of the system with target moved to different locations in the region of interest according to an exemplary implementation of the present disclosure/ invention.
[0064] The figure shows an example deployment of the system with the target moved to different locations in the region of interest. In this embodiment, the mounting device is configured to hold the pose control device and a plurality of image acquisition devices.
[0065] The figure represents an alternate deployment of an embodiment of the invention. The pose control device/system (depicted here as an integrated system, though it can be composed of disparate components as required) can be used to change the orientation of the imaging device to keep the target in view as it moves in the region of interest. The pose control device/system can employ one or more of the following to extract information about the imaging device's location and orientation: inclinometers, accelerometers, gyroscopes, non-contact distance sensors, inertial measurement units, etc. Other possible embodiments would have the pose control device/system integrated directly with the imaging device. The pose of the imaging device can be controlled manually or by integrated motors and actuators, either under operator control or autonomously. Some embodiments can offer more degrees of freedom based on the movements of the target. Linear actuators can be used to control the distance between the camera and the plane of interest, while motors can be used to control angular movements. For lower complexity, these movements can be provided through joints and bearings for manual adjustment.
[0066] The imaging device provides a still image or a live video feed of the target. Depending on the application, the imaging device can provide images in different spectra (infrared, thermal etc.). The imaging device may have integrated sensors and actuators that can be utilized in the pose control system. The imaging device and pose control system employ wired or wireless communications to transmit data for processing. Communication between the operator and processing device (computer) can also be wired or wireless, as required. Prior to deployment, the imaging device will undergo a calibration step to determine its intrinsic parameters. Acquired images may also need to undergo pre-processing before being used for position estimation.
[0067] The target is the device that is to be tracked by the imaging device. The target may take on many embodiments, varying in size, shape, appearance and intrinsic abilities such as, but not limited to, illumination and locomotion. An example embodiment is an illuminated buoy that can float on the surface of water, tethered to a moving body.
[0068] Another embodiment could be a crawling robot moving on a ceiling. The target needs to be visible to the imaging device for it to be detected and a position determined. The pose control system is used to keep the target in view of the imaging device, by changing the pose of the imaging device accordingly.
[0069] The design of the mount for the imaging device and pose control system can take on many different embodiments depending on the specific embodiments of the imaging device and pose control system, access to the space where positioning needs to be done, and the location of the plane of interest. For example, in a scenario where the target needs to be positioned on the floor of a container with only one opening on the container's roof, an embodiment of the mount would have the imaging device and pose control system hanging from the access point, with the imaging device posed to face the floor plane. The mount may have mechanical features to secure it to anchor points identified in the environment. For example, the mount can be fastened to the lip of a tank manhole to keep it secure, with provisions for raising and lowering the imaging device and changing its angle. When positioning on a ceiling is required, a tripod is a possible mounting solution. For situations where access to the space is unconventional, an assessment of the environment is required to identify an ideal mount design.
[0070] The computer can be any commercially available system compatible with the imaging device and capable of running the process steps. The process steps can vary depending on the specific embodiments of other components of the invention. A sample process flowchart is described in figure 7. Most embodiments of the invention will have the following features in the process: Image capture from the imaging device, pose data capture from the pose control system, target detection in the captured image and estimation of the target position. Variable aspects of the process include, but are not limited to: communication between the imaging device/pose control system and the computer, communication between the computer and operator, visualization of data, data logging etc.
[0071] Figure 7 shows process flowchart for an embodiment of the invention according to an exemplary implementation of the present disclosure/ invention.
[0072] The process begins with the initialization of hardware and processing modules, including the imaging device, pose control system, object detection model, data logging and estimated position visualization (bird's-eye-view map). The process then enters its main loop: while the process has not exited, it captures a new image and pose data, estimates the position of the target, and updates the data log and the position visualization. Once the process has exited, the data acquired during the process is saved and exported to the operator.
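A minimal Python sketch of this loop is given below. Every object here (camera, pose_system, detector, log, view) is a hypothetical stand-in for the initialized modules, not an API defined by the specification; pixel_to_plane is sketched after the discussion of figure 8 below:

```python
def run_positioning(camera, pose_system, detector, K, log, view):
    """Main loop mirroring the flowchart of figure 7: capture image and pose,
    detect the target, estimate its position, update the log and the
    bird's-eye-view map, and export the acquired data on exit."""
    try:
        while not view.exit_requested():
            image = camera.capture()        # new image from the imaging device
            pose = pose_system.read()       # current pose of the imaging device
            pixel = detector(image)         # pixel location of the target, or None
            if pixel is not None:
                position = pixel_to_plane(pixel, pose, K)  # see figure 8 sketch
                log.append(position)        # data logging
                view.update(position)       # position visualization
    finally:
        log.export()                        # save and export data to the operator
```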
[0073] Figure 8 shows geometric relationship used to transform image space to physical space according to an exemplary implementation of the present disclosure/ invention.
[0074] The figure shows the geometric relationship used to transform image space to physical space. After an image is acquired, the target is detected and its pixel location is extracted (as exemplified in figure 3). This pixel location is fed into a process that converts it into a real-world position. The process that maps image space to physical space relies on the relative geometry between the imaging device and the plane of interest. The basic form of the equation to estimate position (based on the symbols shown in figure 8) is d = h*tan(f), where 'f' is determined from the pose of the camera relative to the plane, the intrinsic camera parameters and the pixel location of the target in the captured image. The real-world position is then returned to the operator. Depending on the embodiment, this can be as text, or in the form of a bird's-eye-view map or other representations.
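A sketch of this transform is given below, assuming a pinhole camera model, the PoseSample fields introduced earlier, and the intrinsic matrix K from calibration; the sign conventions and the small-angle lateral term are simplifying assumptions, not taken from the specification:

```python
import numpy as np

def pixel_to_plane(pixel, pose, K):
    """Map a target's pixel location (u, v) to a 2D position on the plane,
    following d = h*tan(f) from figure 8. 'f' is the angle of the pixel's
    ray from the vertical: the camera pitch plus the angular offset of the
    pixel from the principal point."""
    u, v = pixel
    fx, fy = K[0, 0], K[1, 1]                     # focal lengths in pixels
    cx, cy = K[0, 2], K[1, 2]                     # principal point
    f = pose.pitch_rad + np.arctan2(v - cy, fy)   # ray angle from the vertical
    d = pose.height_m * np.tan(f)                 # ground distance along the heading
    lateral = d * (u - cx) / fx                   # sideways offset (small-angle approx.)
    # Rotate (forward, lateral) by the camera yaw into the fixed plane frame.
    c, s = np.cos(pose.yaw_rad), np.sin(pose.yaw_rad)
    return (c * d - s * lateral, s * d + c * lateral)
```

In the storage-tank embodiment described with figure 9, for instance, h would come from the ultrasonic level sensor and the yaw from the rotation of the mounting shaft.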
[0075] Figure 9 shows an example working of the system according to an exemplary implementation of the present disclosure/ invention.
[0076] The figure shows another embodiment of the working of the system, designed for positioning a remotely operated vehicle (ROV) inside storage tanks. Many conventional positioning systems fail in this environment. Access to the tank is limited to a manhole on the roof, through which the ROV and the positioning assembly must be deployed.
[0077] In this embodiment, the mount and pose control device of the system allow three degrees of freedom for the camera: vertical translation, yaw rotation and pitch rotation. In another embodiment, the mount and pose control device may also allow lateral movement and roll control of the camera.
[0078] The mounting device consists of a metal frame that attaches to the manhole edge. A long shaft passes through this frame and holds the camera assembly; the shaft can be rotated to adjust the camera's yaw angle, and raised or lowered to adjust the camera height. The camera assembly contains an angle indexing plate to adjust the camera's pitch. The chosen camera has built-in sensors that return pose estimates, and pose control is handled manually. An ultrasonic level sensor is used to determine the height of the camera relative to the water surface (which can change during an inspection). The interior of the tank is relatively dark, so the target is designed to emit its own light so that it can be seen. The target is connected to the ROV in order to match its movement. The camera and level sensor are connected to a computer which sits on the roof of the tank. The computer processes the camera images, extracts the position and communicates it to the operators, who sit at the base of the tank.
[0079] The various embodiments described above are specific examples of a single broader invention. Any modifications, alterations or equivalents of the above-mentioned embodiments pertain to the same invention, so long as they do not fall beyond the scope of the invention as defined by the appended claims. It will be apparent to a person skilled in the art that the method and system for determining 2D localization of a target in confined spaces may be provided using some or many of the above-mentioned features or components without departing from the scope of the invention. It will also be apparent to a skilled person that the embodiments described above are specific examples of a single broader invention which may have greater scope than any of the singular descriptions taught. Many alterations may be made in the invention without departing from its spirit and scope.
[0080] Figures are merely representational and are not drawn to scale. Certain portions thereof may be exaggerated, while others may be minimized. Figures illustrate various embodiments of the invention that can be understood and appropriately carried out by those of ordinary skill in the art.
[0081] In the foregoing detailed description of embodiments of the invention, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the invention require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description of embodiments of the invention, with each claim standing on its own as a separate embodiment.
[0082] It is understood that the above description is intended to be illustrative, and not restrictive. It is intended to cover all alternatives, modifications and equivalents as may be included within the spirit and scope of the invention as defined in the appended claims. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the invention should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively.
Claims:
1. A system (100) for determining 2D localization of a target in confined spaces, the system comprising:
at least one image acquisition device (110) configured to capture an image of an area of interest covering the target on a plane surface;
at least one pose control device (120) configured to measure and adjust a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest; and
at least one processing device (130) in communication with the image acquisition device (110) and the pose control device (120), and configured to receive the captured image of the target from the image acquisition device and the measured position and orientation of the image acquisition device from the pose control device to determine the 2D localization of the target.

2. The system as claimed in claim 1, further comprising at least one mounting device (140) configured to hold the pose control device and the image acquisition device.

3. The system as claimed in claim 1, wherein the image acquisition device (110) comprises a plurality of integrated sensors configured to monitor and capture images, in different spectra, of the target, which moves on the plane surface.
4. The system as claimed in claim 3, wherein the target is a stationary or a non-stationary target, varying in size, shape, appearance and intrinsic abilities.

5. The system as claimed in claim 1, wherein the pose control device (120) comprises at least one of inclinometers, accelerometers, gyroscopes, non-contact distance sensors and inertial measurement units to measure the position and the orientation of the image acquisition device.

6. The system as claimed in claim 1, wherein the pose control device (120) comprises at least one of integrated motors and actuators to adjust the position and the orientation of the image acquisition device relative to the plane surface.

7. The system as claimed in claim 6, wherein the actuators control the distance between the image acquisition device and the area of interest, and the integrated motors control angular movements of the image acquisition device.

8. The system as claimed in claim 1, wherein the pose control device (120) is configured to keep the target in view of the image acquisition device, by changing the pose of the image acquisition device based on the movement of the target.

9. The system as claimed in claim 1, wherein the processing device (130) is configured to determine the 2D localization information of the target on the plane surface based on the position and the orientation information of the image acquisition device relative to the plane surface and a pixel location of the target in the image.

10. The system as claimed in claim 2, wherein the mounting device (140) and pose control device (120) are configured to provide the required degrees of freedom for the image acquisition device.

11. The system as claimed in claim 2, wherein the mounting device (140) is configured to hold the pose control device and a plurality of image acquisition devices.

12. The system as claimed in claim 1, wherein the image acquisition device (110) and the pose control device (120) communicate with the processing device (130) through wired or wireless communication.

13. A method (500) for determining 2D localization of a target in confined spaces, the method comprising:
capturing (510), by at least one image acquisition device, an image of an area of interest covering the target on a plane surface;
measuring and adjusting (520), by at least one pose control device, a position and an orientation of the image acquisition device relative to the plane surface to identify and capture the target in the area of interest;
receiving (530), by at least one processing device, the captured image of the target from the image acquisition device and the measured position and orientation of the image acquisition device from the pose control device; and
determining (540), by the at least one processing device, the 2D localization of the target based on the received image and the measured position and orientation of the image acquisition device.

14. The method as claimed in claim 13, further comprising calibrating, prior to deployment, the image acquisition device, the pose control device and the processing device to determine intrinsic parameters of the devices.

Documents

Application Documents

# Name Date
1 202441043379-STATEMENT OF UNDERTAKING (FORM 3) [04-06-2024(online)].pdf 2024-06-04
2 202441043379-OTHERS [04-06-2024(online)].pdf 2024-06-04
3 202441043379-FORM FOR STARTUP [04-06-2024(online)].pdf 2024-06-04
4 202441043379-FORM FOR SMALL ENTITY(FORM-28) [04-06-2024(online)].pdf 2024-06-04
5 202441043379-FORM 1 [04-06-2024(online)].pdf 2024-06-04
6 202441043379-FIGURE OF ABSTRACT [04-06-2024(online)].pdf 2024-06-04
7 202441043379-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [04-06-2024(online)].pdf 2024-06-04
8 202441043379-DRAWINGS [04-06-2024(online)].pdf 2024-06-04
9 202441043379-DECLARATION OF INVENTORSHIP (FORM 5) [04-06-2024(online)].pdf 2024-06-04
10 202441043379-COMPLETE SPECIFICATION [04-06-2024(online)].pdf 2024-06-04
11 202441043379-Proof of Right [24-06-2024(online)].pdf 2024-06-24
12 202441043379-FORM-26 [24-06-2024(online)].pdf 2024-06-24
13 202441043379-FORM-9 [29-06-2024(online)].pdf 2024-06-29
14 202441043379-STARTUP [02-07-2024(online)].pdf 2024-07-02
15 202441043379-FORM28 [02-07-2024(online)].pdf 2024-07-02
16 202441043379-FORM 18A [02-07-2024(online)].pdf 2024-07-02
17 202441043379-FER.pdf 2025-05-28
18 202441043379-FER_SER_REPLY [08-09-2025(online)].pdf 2025-09-08
19 202441043379-DRAWING [08-09-2025(online)].pdf 2025-09-08
20 202441043379-COMPLETE SPECIFICATION [08-09-2025(online)].pdf 2025-09-08
21 202441043379-CLAIMS [08-09-2025(online)].pdf 2025-09-08

Search Strategy

1 202441043379_SearchStrategyNew_E_SearchHistory(28)E_28-05-2025.pdf