
Autonomous Indoor Exploration Using Mobile Vehicle And Simultaneous Mapping Of Indoor Environment

Abstract: Disclosed is a computer-implemented method and device for autonomous exploration, localization and mapping by a mobile vehicle. The said method comprises: scanning the environment to generate sensor data, using a plurality of sensors; estimating a circular arc from the said sensor data to map the free regions and the obstacle regions for navigation by the vehicle; generating, from the circular arc, a plurality of preliminary estimates of the next position of the said vehicle in the said free region; calculating heuristically the next best target point in planar space; moving the mobile vehicle to the target position and performing autonomous localization and mapping at the new position; and repeating the sequence of steps to explore the navigable space, whereby a map of the environment is generated simultaneously. Fig. 1


Patent Information

Application #
Filing Date
14 December 2016
Publication Number
24/2018
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Email
patents@ssrana.com
Parent Application

Applicants

AGARWAL, Deepak
117 Crema Block, Mahagun Mansion, Phase 1, Indirapuram, Ghaziabad- 201010 Uttar Pradesh, India

Inventors

1. AGARWAL, Deepak
117 Crema Block, Mahagun Mansion, Phase 1, Indirapuram, Ghaziabad- 201010 Uttar Pradesh, India

Specification

Description: FIELD OF INVENTION
[0001] The present invention relates generally to a method of controlling a mobile robot, and more particularly, to a solution for simultaneous localization, mapping and accurate determination of the next position of an unmanned system, such as a mobile robot.

BACKGROUND OF THE INVENTION
[0002] The field of robotics has expanded greatly to include disaster-detection robots, home service robots, escort robots, military robots, etc. Because an autonomous mobile robot is a highly intelligent body, combining flexible and powerful locomotion with observation equipment, the use of a robot's map-building capabilities in place of personnel sent into dangerous environments to explore the unknown has become an effective means of rescue work.

[0003] Mobile robots with a capability to localize and map the environment are common in society for a variety of applications. Localization techniques assist a robot to determine its position with respect to its environment. For example, in a “pure” localization system, the robot is provided with a map of its surroundings. Such “pure” localization systems are disadvantageous because generating a map via manual techniques is a relatively difficult, labour-intensive, and specialized task. Moreover, many environments are not static. As a result, maps in pure localization systems are subject to relatively frequent and costly updates so that the map accurately represents the surroundings. Further, mapping techniques relate to processes by which a robot builds a map of its surroundings. A robot that can autonomously build a map of its surroundings and can localize itself within the map can advantageously exhibit a relatively high degree of autonomy. Moreover, such a robot can advantageously adapt to changes in its surroundings. This process of building a map and using the generated map to localize itself is known as Simultaneous Localization and Mapping (SLAM).

[0004] Conventional SLAM techniques exhibit many disadvantages. For example, one conventional SLAM technique builds a map using a laser rangefinder. Such laser rangefinder techniques, although accurate, are relatively unreliable in dynamic environments such as environments where people are walking or objects are changing location. In addition, a laser rangefinder is costly infrastructure for many day-to-day robot applications.

[0005] Another conventional SLAM technique uses an Expectation Maximization (EM) algorithm to generate a map. Disadvantageously, such EM techniques are computationally intensive and are difficult to compute in real time.

[0006] Yet another conventional technique for localization uses visual images taken from a 3-camera array. Disadvantageously, low-production-volume multi-camera arrays cost substantially more than mass-produced single cameras. Moreover, conventional techniques utilizing visual images are not capable of autonomous mapping.

[0007] Further, these known mobile robots are incapable of reliably determining the future position of the unmanned robot in a dynamic environment.

[0008] In prior art documents, EP2856273 discloses a solution for estimating the pose of a platform, such as a vehicle. Data from a plurality of types of sensing devices located on the platform can be used to independently calculate a plurality of preliminary estimates corresponding to the pose. A plurality of estimates corresponding to the pose can be generated using the preliminary estimates and at least one covariance matrix. One or more entries in the covariance matrix are adjusted based on an uncertainty for the corresponding preliminary estimate. The uncertainty can vary based on time, distance, and/or velocity of the platform.

[0009] Although the said patent solves the problem of localizing and mapping in an indoor environment, it does not assist the mobile vehicle in navigating the indoor environment effectively without colliding with obstacles. Thus, the said patent is unable to address the problem of exploration, i.e. autonomously navigating the space without any human intervention.

[0010] Current approaches frequently combine, or fuse, the results of two or more pose information acquisition methods to provide more reliable overall position estimation for an unmanned vehicle. These approaches assume that combining the results of multiple data acquisition methods will significantly reduce the overall uncertainty in the resulting position estimate; however, sufficient certainty is still not achieved. For example, iRobot uses visual SLAM in its robot vacuum cleaners for indoor mapping and robot localization. The main drawback is that the said robot cannot decide which position it should visit next.

[0011] Thus, in view of the above limitations of the existing approaches, it is necessary to develop a method and mobile robot that can autonomously explore the indoor surroundings and decide the next position of the mobile unmanned robot, while simultaneously mapping the indoor environment and localizing the vehicle.

[0012] Thus, the above-described deficiencies of existing methods and systems for autonomous exploration with mapping and localization are merely intended to provide an overview of some of the problems of conventional methods, and are not intended to be exhaustive. Other problems with conventional methods and systems, and the corresponding benefits of the various non-limiting embodiments described herein, may become further apparent upon review of the following description.

SUMMARY OF INVENTION
[0013] The following presents a simplified summary of the invention to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the present invention. It is not intended to identify the key/critical elements of the invention or to delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description of the invention presented later.

[0014] It is therefore an object of the present invention to overcome the drawbacks of the prior art.

[0015] It is another object of the present invention to provide a solution for estimating the position of a mobile vehicle in an indoor environment.

[0016] It is another object of the present invention to provide a computer implemented system comprising: a set of modules configured to perform a method comprising periodically calculating a plurality of estimates corresponding to a position of a robot.

[0017] It is yet another object of the present invention to provide a method for autonomous estimation of new position of a mobile vehicle in an indoor environment, while simultaneously mapping the environment and localizing the vehicle in it.

[0018] It is still another object of the present invention to provide a mobile vehicle capable to autonomously explore the new position of the vehicle in an indoor environment, while simultaneously mapping the environment and localizing itself.

[0019] Accordingly, in an aspect the present invention provides a computer-implemented method of autonomous exploration, localizing and mapping by a mobile vehicle. The said method comprises: scanning the environment to generate sensor data, using a plurality of sensors;
estimating a circular arc from the said sensor data to map the free regions and the obstacle regions for navigation by the vehicle; generating, from the circular arc, a plurality of preliminary estimates of the next position of the said vehicle in the said free region; calculating heuristically the next best target point in planar space; and moving the mobile vehicle to the target position and performing autonomous localization and mapping at the new position. At the new position the same sequence of steps is repeated until the entire navigable space has been explored. By the time the entire navigable space is covered, the map of the environment has also been generated simultaneously.

[0020] In another aspect the present invention provides a mobile electronic vehicle capable of autonomously exploring, localizing and mapping an environment by the method as described herein. The said vehicle comprises: an image capturing means configured to scan and acquire a plurality of image data of the environment; and a navigation system. The navigation system comprises: an exploration module configured to generate a circular arc to determine freely navigable spaces around the mobile vehicle using a laser scanner or ultrasonic sensor, and to decide the next direction of vehicle movement so as to maximize the area covered in the least number of exploration steps; and a SLAM module configured to update an estimate of the vehicle location and the locations of existing and newly generated landmarks while navigating, and to navigate the same vehicle with the sensor data based on the updated estimate of the vehicle location, said update occurring when a landmark stored in the database is visually observed.

[0021] Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.

BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings in which:

[0023] Figure 1 illustrates the Local Safe Region as per the sensor perception strategy, in accordance with an embodiment of the present subject matter.

[0024] Figure 2 illustrates the Safe Region, shown in grey, calculated as the union of all local free regions, in accordance with an embodiment of the present subject matter.

[0025] Figure 3 illustrates the obstacle, free-space, and frontier arcs, and the computation of the arcs, in accordance with an embodiment of the present subject matter.

[0026] Figure 4 illustrates the 3D map generated by the method, in accordance with an embodiment of the present subject matter.

[0027] Figure 5 illustrates the method steps configured in the modules of the mobile vehicle, in accordance with an embodiment of the present subject matter.

[0028] Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and may not have been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various exemplary embodiments of the present disclosure. Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.

DETAILED DESCRIPTION OF THE PRESENT INVENTION
[0029] The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary.

[0030] Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.

[0031] The terms and words used in the following description and claims are not limited to their bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purposes only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.

[0032] It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.

[0033] By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.

[0034] Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments.

[0035] It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

[0036] The present invention relates to a method for estimating the position of an unmanned system travelling on the ground. The invention provides more certain and reliable future-position estimation in the autonomous exploration of an indoor space.

[0037] The technical problems addressed by the present invention in autonomous exploration of an indoor space which is bounded by walls, obstacles etc. are as follows:

[0038] Firstly, to know where to go, i.e. EXPLORATION: in the absence of any information about the indoor environment (topology), it is difficult to judge where to navigate. The invention provides a method to find a unique space point to which the vehicle should move so that it does not collide with any obstacles, while at the same time ensuring that the space point has not been visited before.

[0039] Secondly, to know where the robot is, i.e. LOCALIZATION: to autonomously navigate an indoor environment, it is imperative for the navigating vehicle to estimate its position with respect to the indoor environment. The invention provides a method for the vehicle to estimate its position with respect to the indoor environment.

[0040] Thirdly, to map the indoor space, i.e. MAPPING: to know where the robot is, it is important that the vehicle knows unique landmarks/identification points within the indoor environment, which the vehicle uses to identify its position with respect to those landmarks. The invention provides a method for the vehicle to map the indoor environment.

[0041] Thus, the present invention provides a method and system to determine where the vehicle should move in indoor space so that it maps the environment. Once the vehicle moves to a new location, it scans the environment and estimates its position with respect to the indoor environment. Further, the vehicle keeps track of the places it has already covered. As the objective of the robot is exploration, it determines which space point to move to next so as to cover the maximum navigable space. This process is repeated until the entire navigable space is covered by the vehicle.
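
The cycle just described can be summarised as a simple loop. The following sketch is illustrative only: the helpers called on vehicle and world_map (scan, update, localize, move_to) and the function pick_next_target are hypothetical stand-ins for the sensing, SLAM and exploration modules detailed below, and are not part of the specification.

```python
def explore(vehicle, world_map, pick_next_target):
    """Minimal sketch of the exploration cycle: scan, map, localize,
    pick the next space point, move, and repeat until the navigable
    space is covered. All callables are hypothetical stand-ins."""
    visited = set()
    while True:
        scan_data = vehicle.scan()                     # scan the environment
        world_map.update(scan_data)                    # map the surroundings
        pose = vehicle.localize(world_map)             # position w.r.t. landmarks
        visited.add((round(pose[0]), round(pose[1])))  # places already covered
        target = pick_next_target(world_map, visited)  # heuristic best point
        if target is None:                             # no frontier remains
            return world_map                           # exploration complete
        vehicle.move_to(target)                        # go to the chosen point
```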

[0042] In the present invention, a solution is provided for estimating the next/new position of an unmanned mobile vehicle, such as a robot, in a dynamic environment. Solving the problem of exploration is critical to autonomously navigating the space without any human intervention. The present invention builds either a 2D or a 3D map of an indoor space, depending on the type of sensor used to map the space. Moreover, the present method performs localizing and mapping independently of the exploration problem. Depending on the sensor types used for the application, the configuration of the processor of the vehicle may be adapted.

[0043] In an implementation, the vehicle maps the space around it by sensing the space using a laser scanner, to determine the free regions and the areas it cannot navigate due to obstacles. Referring to Figure 1, which shows the Local Safe Region as per the sensor perception strategy: the robot is the circular body located at the centre of the scene. The free region is obtained by defining a ball whose radius r is the minimum range reading, as shown in Figure 1. In this case r may be the distance to the closest obstacle or, in wide open areas, the maximum range of the available sensors. The ball radius has the same value r in every direction. The Safe Region shown in Figure 2, defined as the freely navigable area in indoor space, is built as the union of balls of different sizes collected at different points in free space.
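
As a concrete illustration of this ball construction, the sketch below computes the local free radius from an array of range readings and tests whether a point lies in the union-of-balls Safe Region. The function names and data layout are assumptions made for illustration, not part of the specification.

```python
import numpy as np

def local_safe_radius(ranges, max_range):
    """Radius r of the local free ball: the minimum range reading,
    i.e. the distance to the closest obstacle, capped at the sensor's
    maximum range in wide open areas."""
    return float(min(np.asarray(ranges, dtype=float).min(), max_range))

def in_safe_region(point, centers, radii):
    """The Safe Region is the union of balls collected at different
    scan positions; a point is safely navigable if any ball contains it."""
    point = np.asarray(point, dtype=float)
    return any(np.linalg.norm(point - np.asarray(c)) <= r
               for c, r in zip(centers, radii))
```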

[0044] Once the local safe region has been calculated as stated above, the mobile vehicle has to estimate the direction in which it must move next to explore the unknown area. This is done using frontier arcs. A procedure for identifying frontier arcs is illustrated in Figure 3. One sample point is placed along the axis of each rangefinder at a distance r (the radius of the ball). Each point is classified as obstacle if the corresponding range measure is r, as free if it falls in the ball of another node belonging to the tree, and as frontier otherwise. The resulting classification is propagated to the elementary arc represented by the point, i.e., the arc resulting from the intersection of the ball with the corresponding sensor cone. The final frontier, free, and obstacle arcs are obtained by joining contiguous elementary boundary portions that have been classified as such.
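
The per-beam classification just described can be written down directly. The sketch below is a minimal, assumed rendering of that procedure (the names and the eps tolerance are illustrative):

```python
import numpy as np

def classify_beams(pose, angles, ranges, r, explored_balls, eps=1e-3):
    """Classify the sample point placed at distance r along each beam:
    'obstacle' if the range reading equals r, 'free' if the point falls
    inside the ball of a previously visited node, 'frontier' otherwise.
    Joining contiguous equal labels yields the final arcs.
    explored_balls is a list of (center, radius) pairs."""
    labels = []
    for theta, reading in zip(angles, ranges):
        p = np.asarray(pose) + r * np.array([np.cos(theta), np.sin(theta)])
        if abs(reading - r) < eps:
            labels.append("obstacle")    # closest obstacle lies on the ball
        elif any(np.linalg.norm(p - np.asarray(c)) <= rad
                 for c, rad in explored_balls):
            labels.append("free")        # already covered by an earlier ball
        else:
            labels.append("frontier")    # boundary of unexplored space
    return labels
```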

[0045] The arcs marked as frontier arcs border free regions into which the vehicle can move without colliding with obstacles. The vehicle chooses a heuristically calculated direction within a frontier arc along which it may move. The vehicle then moves to the next location and again scans the environment around it. The grey patches in Figures 1 and 2 are identified as free zones where vehicle movement is possible without any obstacles. The white patches in Figures 1 and 2 are the unexplored regions into which the robot must navigate to explore the space.
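
The specification states that the direction is chosen heuristically but does not fix the heuristic. One plausible choice, sketched below purely as an assumption, is to head for the midpoint of the widest contiguous frontier arc, since a wide frontier promises the most newly uncovered area per step:

```python
import numpy as np

def pick_frontier_target(pose, angles, labels, r):
    """Assumed heuristic: return the point at distance r along the middle
    beam of the widest contiguous run of 'frontier' labels, or None when
    no frontier remains (exploration is then complete)."""
    runs, start = [], None
    for i, lab in enumerate(list(labels) + ["end"]):   # sentinel closes last run
        if lab == "frontier" and start is None:
            start = i
        elif lab != "frontier" and start is not None:
            runs.append((start, i - 1))
            start = None
    if not runs:
        return None
    s, e = max(runs, key=lambda run: run[1] - run[0])  # widest frontier arc
    theta = angles[(s + e) // 2]                       # its middle beam
    return np.asarray(pose) + r * np.array([np.cos(theta), np.sin(theta)])
```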

[0046] In an implementation, the vehicle develops grey patches as it moves along; grey signifies space that is free and freely navigable. At each of the points where the vehicle scans the environment, the vehicle's sensor system is also building the 2D or 3D map (Figure 4) of the environment, and at the same time it is localizing the vehicle with respect to landmarks/features already stored in the vehicle/robot memory.

[0047] In an implementation, the vehicle may use different sensors to achieve the objective of locating the vehicle in space. For example, the following sensors may be used alone or in combination: wheel encoders (odometry) with a laser scanner (2D map); or wheel encoders (odometry) with a laser scanner and a camera (3D map).

[0048] In a preferred implementation, the computer-implemented method of autonomous exploration, localizing and mapping by a mobile vehicle comprises: scanning the environment, using a plurality of sensors; performing the tasks of exploration, localization and mapping autonomously on the mobile vehicle by fusing data from multiple sensors using a Kalman filter; calculating a circular arc area from the said sensor data to map the free regions and the obstacle regions for navigation by the vehicle; generating, from the arc area, a preliminary estimate of the next position of the said vehicle in the said free region; and moving the said vehicle to the preferred next position, mapping the surrounding environment at the new position of the vehicle, and finally localizing the vehicle with respect to landmarks already seen by the vehicle before; wherein exploring the next position of the mobile vehicle depends on the generation of sensor data to form 2D or 3D maps of the environment.
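
To make the data-fusion step concrete, a minimal linear Kalman-filter cycle is sketched below: a wheel-odometry displacement drives the prediction and a laser-derived position fix drives the correction. The identity motion and observation models are simplifying assumptions; an actual implementation would use the vehicle's motion model.

```python
import numpy as np

def kf_fuse(x, P, u, z, Q, R):
    """One Kalman-filter cycle fusing odometry with a laser position fix.
    x, P: current 2D position estimate and its covariance;
    u: odometry displacement since the last cycle (prediction input);
    z: position measurement derived from the laser scan (correction);
    Q, R: process and measurement noise covariances (assumed known)."""
    # Predict: dead-reckon with wheel odometry; uncertainty grows by Q.
    x_pred = x + u
    P_pred = P + Q
    # Correct: blend in the laser fix, weighted by the Kalman gain.
    K = P_pred @ np.linalg.inv(P_pred + R)
    x_new = x_pred + K @ (z - x_pred)
    P_new = (np.eye(len(x)) - K) @ P_pred
    return x_new, P_new
```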

[0049] In an implementation, the mobile electronic vehicle capable of autonomously exploring, localizing and mapping an environment comprises: an image capturing means configured to scan and acquire a plurality of image data of the environment; and a navigation system. The navigation system comprises: a module configured to generate a frontier arc to determine freely navigable spaces around the mobile vehicle using a laser scanner or ultrasonic sensor; a module configured to store the generated landmarks in a database comprising one or more existing landmarks; and a module configured to update an estimate of the vehicle location and the locations of the existing and generated landmarks while navigating, and to navigate the same vehicle with the sensor data based on the updated estimate of the vehicle location, said update occurring when a landmark stored in the database is visually observed.

[0050] Referring to Figure 5, which illustrates the method steps configured in the modules of the mobile vehicle, the two modules are as follows. Exploration module: a sensor-based technique for exploring unknown environments with a mobile robot. The module is based on the randomized incremental generation of a data structure called the Sensor-based Random Tree (SRT), which represents a roadmap of the explored area with an associated safe region. Once the local safe region, free of obstacles, has been calculated, this module heuristically decides in which direction the vehicle must move to maximize the area covered in the least number of exploration steps. SLAM module: SLAM stands for simultaneous localization and mapping. The SLAM module is concerned with the problem of building a map of an unknown environment with a mobile vehicle while at the same time navigating the environment using the map. SLAM consists of multiple parts: landmark extraction, data association, state estimation, state update and landmark update. The SLAM module performs SLAM using the Extended Kalman Filter technique, wherein multiple sensor data are fused to get the best estimate of the vehicle position with respect to the observed landmarks. This module is configured with instructions to update an estimate of the vehicle location and the locations of the existing and generated landmarks while navigating, and to navigate the same vehicle with the sensor data based on the updated estimate of the vehicle location, said update occurring when a landmark stored in the database is visually observed.
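
To illustrate the SLAM module's state update, the sketch below shows one Extended Kalman Filter measurement update when a stored landmark is re-observed, using the standard range-bearing observation model. It is a simplified, assumed sketch (no angle wrapping, no data-association gating), not the module's actual code.

```python
import numpy as np

def ekf_landmark_update(x, P, z, lm_idx, R):
    """EKF-SLAM correction for a re-observed landmark. The state x stacks
    the vehicle pose (x, y, heading) followed by 2D landmark positions;
    z is the measured (range, bearing) of landmark lm_idx; R is the
    measurement noise covariance."""
    px, py, th = x[0], x[1], x[2]
    j = 3 + 2 * lm_idx                       # landmark's slot in the state
    dx, dy = x[j] - px, x[j + 1] - py
    q = dx * dx + dy * dy                    # squared distance to landmark
    z_hat = np.array([np.sqrt(q), np.arctan2(dy, dx) - th])  # predicted obs
    H = np.zeros((2, len(x)))                # observation-model Jacobian
    H[:, 0:3] = [[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                 [dy / q, -dx / q, -1.0]]
    H[:, j:j + 2] = [[dx / np.sqrt(q), dy / np.sqrt(q)],
                     [-dy / q, dx / q]]
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    x_new = x + K @ (z - z_hat)              # correct pose and landmarks
    P_new = (np.eye(len(x)) - K @ H) @ P     # shrink the covariance
    return x_new, P_new
```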

[0051] The two modules work together as follows: the exploration module estimates the next best location of the mobile vehicle in planar space, while the SLAM module performs SLAM at that target location.

[0052] Apart from what is disclosed above, the present invention also includes some additional benefits and advantages. A few of the additional benefits are mentioned below:
• Autonomous handling of warehousing tasks such as sorting and picking up pallets at specified locations.
• Autonomous handling of house cleaning using a mobile robotic vacuum cleaner.
• Autonomous exploration and simultaneous mapping of hazardous places.
• Autonomous inspection of buildings under construction.

[0053] The foregoing description of various aspects of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously, many modifications and variations are possible. Such modifications and variations that may be apparent to a person skilled in the art are included within the scope of the invention as defined by the accompanying claims.

Claims: I CLAIM:
1) A computer-implemented method of autonomous exploration, localizing and mapping by a mobile vehicle, the said method comprising:
Scanning the environment to generate sensor data, using a plurality of sensors;
Estimating a circular arc from the said sensor data to map the free regions and the obstacle regions for navigation by the vehicle;
Generating, from the circular arc, a plurality of preliminary estimates of the next position of the said vehicle in the said free region;
Calculating heuristically the next best target point in planar space;
Moving the mobile vehicle to the target position and performing autonomous localization and mapping at the new position;
Repeating the sequence of steps to explore the navigable space, wherein a map of the environment is also generated simultaneously.

2) The method as claimed in claim 1, wherein navigating the vehicle to the new position includes fusing the sensor data from multiple sensors to obtain the next navigable point in free space.

3) The method as claimed in claim 1, wherein the sensors are selected from laser scanners, wheel odometry, ultrasonic sensors, visual sensors, or a combination thereof.

4) The method as claimed in claim 1, wherein the method is capable of autonomously navigating the entire indoor environment and providing the map of the environment in 2D or 3D format depending on the nature of the sensors used.

5) The method as claimed in claim 1, wherein the vehicle comprises a robot with a navigation system based on the existing landmarks and the generated landmark to assist in robot navigation.

6) The method as claimed in claim 3, wherein the visual sensor comprises at least an image capturing means.

7) The method as claimed in claim 1, wherein the method maintains a record of the places already covered in a database and provides an indication of a new landmark that is not stored in the database.

8) The method as claimed in claim 1, wherein localizing and mapping are performed independently of exploration.

9) A mobile electronic vehicle capable of autonomously exploring, localizing and mapping an environment by the method as claimed in claim 1, said vehicle comprising:
an image capturing means configured to scan and acquire a plurality of image data of the environment; and
a navigation system, the navigation system comprises:
an exploration module configured to generate a circular arc to determine freely navigable spaces around the mobile vehicle using a laser scanner or ultrasonic sensor, and to decide the direction in which the vehicle must move to maximize the area covered in the least number of exploration steps; and
a SLAM module configured to update an estimate of the vehicle location and the locations of existing and generated landmarks while navigating, and to navigate the same vehicle with the sensor data based on the updated estimate of the vehicle location, said update occurring when a landmark stored in the database is visually observed.

10) The vehicle as claimed in claim 9, wherein the SLAM module is flexible, allowing an existing SLAM implementation to be selected or a completely new SLAM module to be written.

Documents

Application Documents

# Name Date
1 Form 5 [14-12-2016(online)].pdf 2016-12-14
2 Form 3 [14-12-2016(online)].pdf 2016-12-14
3 Drawing [14-12-2016(online)].pdf 2016-12-14
4 Description(Complete) [14-12-2016(online)].pdf_91.pdf 2016-12-14
5 Description(Complete) [14-12-2016(online)].pdf 2016-12-14
6 Other Patent Document [12-01-2017(online)].pdf 2017-01-12
7 201611042663-Power of Attorney-160117.pdf 2017-01-18
8 201611042663-Correspondence-160117.pdf 2017-01-18
9 abstract.jpg 2017-01-22
10 201611042663-FORM 18 [09-10-2018(online)].pdf 2018-10-09
11 201611042663-OTHERS [27-04-2021(online)].pdf 2021-04-27
12 201611042663-FER_SER_REPLY [27-04-2021(online)].pdf 2021-04-27
13 201611042663-COMPLETE SPECIFICATION [27-04-2021(online)].pdf 2021-04-27
14 201611042663-CLAIMS [27-04-2021(online)].pdf 2021-04-27
15 201611042663-FER.pdf 2021-10-17
16 201611042663-US(14)-HearingNotice-(HearingDate-04-12-2023).pdf 2023-11-08
17 201611042663-Correspondence to notify the Controller [28-11-2023(online)].pdf 2023-11-28

Search Strategy

1 201611042663_search_strategyE_22-08-2020.pdf