
Systems And Methods For Detection Of Precision Graspable Affordances In Cluttered Environment

Abstract: This disclosure relates generally to the detection of graspable regions and grasp poses, together referred to as graspable affordances, from a single-view, at least partial 3D point cloud, without any a priori knowledge of the geometry of the objects to be picked in a cluttered environment. Conventional methods rely on the availability of accurate geometric information about the objects, presume that the objects are isolated, or lack the desired accuracy. In the present disclosure, various object surfaces are identified, and then suitable grasping handles on those surfaces are found by applying the geometric constraints of the gripper to be used. A modified region growing algorithm using a pair of thresholds for the smoothness constraint, along with detection of edge points, finds the natural boundaries of object surfaces. A 6D pose detection problem is converted into a 1D linear search problem by projecting 3D cloud points onto the principal axes of the object surfaces.


Patent Information

Application #: 201821017851
Filing Date: 11 May 2018
Publication Number: 46/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: ip@legasis.in
Parent Application: —
Patent Number: —
Legal Status: —
Grant Date: 12 Mar 2024
Renewal Date: —

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400021, Maharashtra, India

Inventors

1. KUNDU, Olyvia
Tata Consultancy Services Limited, Gopalan Global Axis, SEZ "H" Block, No. 152 (Sy No. 147,157 & 158), Hoody Village, Bangalore - 560066, Karnataka, India
2. KUMAR, Swagat
Tata Consultancy Services Limited, Gopalan Global Axis, SEZ "H" Block, No. 152 (Sy No. 147,157 & 158), Hoody Village, Bangalore - 560066, Karnataka, India

Specification

Claims: A processor implemented method (200) comprising: identifying a plurality of surfaces associated with objects in a 3D point cloud of an environment under consideration using smoothness of surface normals to identify natural boundaries of the objects, the 3D point cloud being obtained from a single view (202), the step of identifying a plurality of surfaces comprising iteratively performing, for each seed point in the 3D point cloud, the following steps: identifying, in the 3D point cloud, a current seed point with a spherical neighborhood having a plurality of points representing neighboring points for the current seed point (202a); obtaining a pair of thresholds representing an upper limit (θ_high) and a lower limit (θ_low) for defining a current region which forms part of at least one of the plurality of surfaces (202b); computing a smoothness criterion for each of the neighboring points of the current seed point, wherein the smoothness criterion is an angle between the surface normals of the current seed point and each of the neighboring points (202c); including in the current region the neighboring points of the current seed point having the angle representing the smoothness criterion less than the defined lower limit, thereby growing the current region to form at least one of the plurality of surfaces (202d); excluding from the current region the neighboring points of the current seed point having the angle representing the smoothness criterion more than the defined upper limit (202e); and including in a list of seed points to be used in a next iteration the neighboring points (i) having the angle representing the smoothness criterion between the lower limit and the upper limit and (ii) identified as non-edge points (202f), thereby terminating the iterative steps (202a through 202e) at an edge of a corresponding surface from the plurality of surfaces. The processor implemented method of claim 1, wherein the neighboring points are identified as edge points if a ratio C_R/m > k; 0 < k < 1 […]

[…] w to avoid collision with non-target objects while making a grasping manoeuvre. Each object surface is associated with three principal axes, namely, n̂, normal to the surface, and two principal axes: the major axis â, orthogonal to the plane of finger motion (the gripper closing plane), and the minor axis f̂, which is orthogonal to the other two axes, as shown in FIG.2B. The grasp pose detection method takes a 3D point cloud C ⊂ ℝ³ and a geometric model of the gripper as input and produces a six-dimensional (6D) grasp pose handle H ∈ SE(3), wherein SE(3) represents a 6-DOF (degree of freedom) space. The 6D pose is represented by a vector p = [x, y, z, θ_x, θ_y, θ_z], where (x, y, z) is the point where the closing plane of the gripper intersects the object surface seen by a robot camera, and (θ_x, θ_y, θ_z) is the orientation of the gripper handle with respect to a global coordinate frame. Searching for a suitable 6-DOF grasp pose is a computationally intensive task; hence, in accordance with the present disclosure, the search space is reduced by applying several constraints. For instance, it is assumed that the gripper approaches the object along a plane which is orthogonal to the object surface seen by the robot camera. In other words, the closing plane of the gripper is normal to the object surface, as shown in FIG.2B.
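By way of illustration only, the following minimal Python sketch (not taken from the patent) shows one way the surface axes n̂, â, f̂ could be recovered by principal component analysis and the 6-DOF search collapsed into a one-dimensional scan of bands along the major axis, as described here and in the passage that follows. The function names and the parameters l_min, band_step and min_support are illustrative assumptions.

```python
import numpy as np

def surface_axes(points):
    """Estimate (centroid, n_hat, a_hat, f_hat) for a surface patch.

    PCA of the patch: the two leading principal directions span the
    surface (major axis a_hat, minor axis f_hat); the last one
    approximates the surface normal n_hat.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    a_hat, f_hat, n_hat = vt[0], vt[1], vt[2]
    return centroid, n_hat, a_hat, f_hat

def scan_bands(points, l_min=0.02, band_step=0.005, min_support=30):
    """1-D search: slide a band of depth l_min along the major axis and
    return the centre of the first band with enough supporting points."""
    centroid, n_hat, a_hat, f_hat = surface_axes(points)
    t = (points - centroid) @ a_hat              # scalar coordinate on a_hat
    for start in np.arange(t.min(), t.max() - l_min, band_step):
        in_band = (t >= start) & (t < start + l_min)
        if in_band.sum() >= min_support:         # illustrative threshold
            grasp_point = centroid + (start + l_min / 2.0) * a_hat
            return grasp_point, (n_hat, a_hat, f_hat)
    return None
```

Any point and axes returned this way would still have to be checked against the gripper's physical constraints (opening width, finger thickness) before being accepted as a grasp handle.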
Since the mean depth of the object surface is known, the pose detection problem becomes a search for three-dimensional (l×b×e) bands along the major axis â, where l is the minimum depth necessary for holding the object. Hence, grasp pose detection becomes a one-dimensional search problem once the object surface is identified. In accordance with the present disclosure, the technical problem of detecting precision graspable affordances in a cluttered environment is solved in two steps: (i) identifying surfaces in 3D point clouds and (ii) applying geometric constraints of a two-finger parallel-jaw gripper to reduce the search space for finding a suitable gripper pose.

FIG.3A through FIG.3C illustrate exemplary flow charts for a computer implemented method for detection of precision graspable affordances in a cluttered environment, in accordance with an embodiment of the present disclosure. In an embodiment, the system 100 includes one or more data storage devices or memory 102 operatively coupled to the one or more processors 104 and configured to store instructions for execution of the steps of the method 200 by the one or more processors 104. The steps of the method 200 will now be explained in detail with reference to the components of the system 100 of FIG.1. Although process steps, method steps, techniques or the like may be described in a sequential order, such processes, methods and techniques may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of the processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.

Accordingly, in an embodiment of the present disclosure, the one or more processors 104 are configured to firstly identify, at step 202, a plurality of surfaces in the 3D point cloud of an environment under consideration, the 3D point cloud being obtained from a single view, using smoothness of surface normals to identify natural boundaries of objects in the environment. Furthermore, the one or more processors 104 are configured to detect in real time, at step 204, graspable affordances in the identified plurality of surfaces for each of the objects in the environment by applying geometric constraints in the form of physical constraints of a gripper to be used for gripping the objects and constraints of the environment.

In accordance with an embodiment of the present disclosure, step 202 involves identifying several surface patches in the 3D point cloud using a modified region growing algorithm. The angle between surface normals is taken as the smoothness criterion and is denoted by the symbol θ. In the conventional region growing algorithm known in the art, the step starts from one seed point, and the points in its neighborhood are added to the current region (or label) if the angle between the surface normal of a neighboring point and that of the seed point is less than a user-defined threshold. The procedure is repeated with all the neighboring points as new seed points until all points have been labeled to one region or the other. The quality of segmentation heavily depends on the choice of the user-defined threshold value: a very low value may lead to over-segmentation, and a very high value may lead to under-segmentation.
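For contrast, here is a minimal sketch (illustrative, not the patent's code) of the conventional single-threshold region growing described above; `normals` is assumed to hold one precomputed unit surface normal per point, and the radius and threshold values are placeholders:

```python
import numpy as np
from scipy.spatial import cKDTree

def region_grow_single(points, normals, r=0.01, theta_thr=np.deg2rad(10.0)):
    """Conventional region growing with a single smoothness threshold."""
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    region = 0
    for start in range(len(points)):
        if labels[start] != -1:
            continue
        labels[start] = region
        seeds = [start]
        while seeds:
            s = seeds.pop()
            for j in tree.query_ball_point(points[s], r):
                if labels[j] != -1:
                    continue
                # angle between seed's and neighbour's surface normals
                cos_t = np.clip(abs(normals[s] @ normals[j]), 0.0, 1.0)
                if np.arccos(cos_t) < theta_thr:   # the single threshold
                    labels[j] = region
                    seeds.append(j)
        region += 1
    return labels
```

With one threshold, noisy normals near a real edge can fall just under theta_thr and leak a region across the edge, or fall just over it inside a smooth patch and split the patch, which is the over-/under-segmentation trade-off noted above.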
The presence of sensor noise further exacerbates this problem, leading to spurious edges when only one threshold is used. This limitation of the conventional region growing algorithm is overcome in the method of the present disclosure by defining edge points and using a pair of thresholds instead of one. FIG.4 illustrates an edge point defined in accordance with an embodiment of the present disclosure. Consider a current seed point s ∈ C with its own spherical neighbourhood N(s), shown as a circle in FIG.4. It is further assumed that the neighbourhood consists of m points (p_i, i = 1, 2, …, m) in the 3D point cloud. Mathematically, the neighbourhood may be represented as follows:

N(s) = {p_i ∈ C | ‖s − p_i‖ ≤ r}; i = 1, 2, …, m (1)

where r is a user-defined radius of the spherical neighborhood. In accordance with an embodiment of the present disclosure, the one or more processors 104 are configured to identify in the 3D point cloud, at step 202a, the current seed point with a spherical neighborhood having a plurality of points representing neighboring points for the current seed point. Each neighboring point p_i has an associated surface normal N_i which makes an angle θ_i with the normal N_s associated with the seed. As stated earlier, θ_i is the smoothness criterion for the modified region growing algorithm of the present disclosure.

Also, in accordance with the present disclosure, the one or more processors 104 are configured to obtain, at step 202b, a pair of thresholds representing a lower limit θ_low and an upper limit θ_high for defining a current region for the neighboring point which forms part of at least one of the plurality of surfaces and for creating new seed points for further propagation. In an embodiment, the pair of thresholds representing the upper limit and the lower limit is defined empirically. In accordance with the present disclosure, the one or more processors 104 are configured to compute, at step 202c, the smoothness criterion for each of the neighboring points of the current seed point. Let Q_s be a set of new seed points which may be used in a next iteration of the modified region growing algorithm, and let R(s) be the set of neighboring points p_i of the current seed point s for which θ_i > θ_high:

R(s) = {p_i ∈ N(s) | θ_i > θ_high}; i = 1, 2, …, m (2)

In accordance with the present disclosure, a neighboring point is identified as an edge point if

C_R/m > k; 0 < k < 1 (3)

where C_R is the cardinality of the set R(s), i.e., a significant fraction of the point's neighbors have surface normals making angles θ_i > θ_high. FIG.5A through FIG.5D illustrate the modified region growing algorithm in accordance with an embodiment of the present disclosure. FIG.5A particularly shows an edge point as point B. An edge point is different from a non-edge point in the sense that the latter lies away from an edge and its neighbors have surface normals more or less in the same direction. One such non-edge point is shown as point A in FIG.5A. Even with sensor noise, the neighboring points around such a seed point will have surface normals with smaller values of angles with respect to the surface normal of the seed point, i.e., θ_i < θ_high.

In accordance with the present disclosure, the modified region growing algorithm starts with the current seed point s, and the region label L{p_i} for a neighboring point p_i ∈ N(s) is defined as follows:

if θ_i < θ_low, then L{p_i} = L{s} and p_i ∈ Q_s;
if θ_i > θ_high, then L{p_i} ≠ L{s} and p_i ∉ Q_s (4)

where the notation p_i ∈ Q_s indicates that the point p_i is added to the list of seed points which may be used by the modified region growing algorithm in a next iteration. However, if the angle between normals lies between the two thresholds, i.e., θ_low ≤ θ_i ≤ θ_high, the neighboring point is added to the list of seed points Q_s only if it is identified as a non-edge point […]
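A hedged sketch of the modified scheme of equations (1) through (4) follows, again illustrative rather than the patent's implementation. The edge test implements C_R/m > k from equation (3); the treatment of the region label for points falling between the two thresholds (here, they join the current region) is an assumption, since the excerpt above specifies only their seeding behaviour. All parameter values are placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def angle(n1, n2):
    """Angle between two unit normals, ignoring normal orientation."""
    return np.arccos(np.clip(abs(n1 @ n2), 0.0, 1.0))

def is_edge_point(i, tree, points, normals, r, theta_high, k):
    """Eq. (3): point i is an edge point if C_R/m > k, where R(s) collects
    neighbours with angle > theta_high (eq. (2)) and m = |N(s)| (eq. (1))."""
    nbrs = tree.query_ball_point(points[i], r)
    if not nbrs:
        return False
    c_r = sum(angle(normals[i], normals[j]) > theta_high for j in nbrs)
    return c_r / len(nbrs) > k

def region_grow_modified(points, normals, r=0.01,
                         theta_low=np.deg2rad(5.0),
                         theta_high=np.deg2rad(15.0), k=0.3):
    tree = cKDTree(points)
    labels = np.full(len(points), -1, dtype=int)
    region = 0
    for start in range(len(points)):
        if labels[start] != -1:
            continue
        labels[start] = region
        seeds = [start]
        while seeds:
            s = seeds.pop()
            for j in tree.query_ball_point(points[s], r):
                if labels[j] != -1:
                    continue
                th = angle(normals[s], normals[j])
                if th < theta_low:             # eq. (4): join and re-seed
                    labels[j] = region
                    seeds.append(j)
                elif th <= theta_high:         # between thresholds: join
                    labels[j] = region         # (assumed), but seed only
                    if not is_edge_point(j, tree, points, normals,
                                         r, theta_high, k):
                        seeds.append(j)        # non-edge points (202f)
                # th > theta_high: excluded from region and from Q_s
        region += 1
    return labels
```

Growth thus stops at edge points: a point whose neighbourhood straddles an edge may still be labelled, but it never seeds further propagation, so the region terminates at the natural boundary of the surface, as step 202f of claim 1 describes.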

Documents

Application Documents

# Name Date
1 201821017851-STATEMENT OF UNDERTAKING (FORM 3) [11-05-2018(online)].pdf 2018-05-11
2 201821017851-REQUEST FOR EXAMINATION (FORM-18) [11-05-2018(online)].pdf 2018-05-11
3 201821017851-FORM 18 [11-05-2018(online)].pdf 2018-05-11
4 201821017851-FORM 1 [11-05-2018(online)].pdf 2018-05-11
5 201821017851-FIGURE OF ABSTRACT [11-05-2018(online)].jpg 2018-05-11
6 201821017851-DRAWINGS [11-05-2018(online)].pdf 2018-05-11
7 201821017851-COMPLETE SPECIFICATION [11-05-2018(online)].pdf 2018-05-11
8 201821017851-Proof of Right (MANDATORY) [23-05-2018(online)].pdf 2018-05-23
9 201821017851-FORM-26 [12-07-2018(online)].pdf 2018-07-12
10 Abstract1.jpg 2018-08-11
11 201821017851- ORIGINAL UR 6( 1A) FORM 1-300518.pdf 2018-08-16
12 201821017851-OTHERS(ORIGINAL UR 6(1A) FORM 26)-160718.pdf 2018-11-14
13 201821017851-OTHERS [09-07-2021(online)].pdf 2021-07-09
14 201821017851-FER_SER_REPLY [09-07-2021(online)].pdf 2021-07-09
15 201821017851-COMPLETE SPECIFICATION [09-07-2021(online)].pdf 2021-07-09
16 201821017851-CLAIMS [09-07-2021(online)].pdf 2021-07-09
17 201821017851-FER.pdf 2021-10-18
18 201821017851-US(14)-HearingNotice-(HearingDate-23-02-2024).pdf 2024-02-06
19 201821017851-FORM-26 [21-02-2024(online)].pdf 2024-02-21
20 201821017851-Correspondence to notify the Controller [21-02-2024(online)].pdf 2024-02-21
21 201821017851-Written submissions and relevant documents [08-03-2024(online)].pdf 2024-03-08
22 201821017851-RELEVANT DOCUMENTS [08-03-2024(online)].pdf 2024-03-08
23 201821017851-MARKED COPIES OF AMENDEMENTS [08-03-2024(online)].pdf 2024-03-08
24 201821017851-FORM 13 [08-03-2024(online)].pdf 2024-03-08
25 201821017851-Annexure [08-03-2024(online)].pdf 2024-03-08
26 201821017851-AMMENDED DOCUMENTS [08-03-2024(online)].pdf 2024-03-08
27 201821017851-PatentCertificate12-03-2024.pdf 2024-03-12
28 201821017851-IntimationOfGrant12-03-2024.pdf 2024-03-12

Search Strategy

1 2021-01-0415-25-20E_06-01-2021.pdf

ERegister / Renewals

3rd: 09 May 2024

From 11/05/2020 - To 11/05/2021

4th: 09 May 2024

From 11/05/2021 - To 11/05/2022

5th: 09 May 2024

From 11/05/2022 - To 11/05/2023

6th: 09 May 2024

From 11/05/2023 - To 11/05/2024

7th: 09 May 2024

From 11/05/2024 - To 11/05/2025

8th: 02 May 2025

From 11/05/2025 - To 11/05/2026