
Three Dimensional (3D) Measurements Using Structured Lights (Laser Line)

Abstract: An apparatus for 3 dimensional measurements of an unknown (test) object adapted to move in an object plane, said apparatus comprising: laser triangulation setup with a laser line source to generate a laser plane; camera setup with a predetermined angle of camera and predetermined FOV (field of view) settings of the camera adapted to generate an image plane; calibration means adapted to perform calibration of said laser triangulation setup using a calibration object of a known size and known measurements in order to define an area bound by X-axis, Y-axis, Z-axis, said calibration means includes: video means to record movement of said calibration object; X-axis calibration means adapted to calibrate said X-axis along said laser line with respect to said recorded video; Y-axis calibration means adapted to calibrate said Y-axis perpendicular to said laser line and parallel to motion of said unknown (test) object with respect to said recorded video; Z-axis calibration means adapted to calibrate said Z-axis along height of said unknown (test) object with respect to said recorded video; means to move said unknown (test) object along said calibrated Y-axis; valid profile extraction means adapted to extract valid profiles of said unknown (test) object; height map generation means adapted to generate height map for said laser plane; width map generation means to generate width map for said laser plane; aligned line generation means adapted to generate an aligned line in vertical direction of said laser plane; camera distortion estimation means adapted to estimate camera distortion based on predefined parameters; and 3D cloud point generation means adapted to generate 3D cloud points for a test object under said calibrated setup using said generated height map, said generated width map, said aligned line, said calibrated axes, and said estimated camera distortion in order to determine dimensions of said unknown (test) object.


Patent Information

Filing Date: 30 October 2009
Publication Number: 05/2012
Publication Type: INA
Invention Field: PHYSICS

Applicants

TATA CONSULTANCY SERVICES LTD.
NIRMAL BUILDING, 9TH FLOOR, NARIMAN POINT MUMBAI-400 021 MAHARASHTRA, INDIA

Inventors

1. SINHA ANIRUDDHA
BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP,SALT LAKE ELECTRONICS COMPLEX, SECTOR-V, KOLKATA-700 091, WEST BENGAL, INDIA
2. BHOWMICK BROJESHWAR
BENGAL INTELLIGENT PARK, BUILDING-D, PLOT NO. A2 M2 & N2, BLOCK-EP,SALT LAKE ELECTRONICS COMPLEX, SECTOR-V, KOLKATA-700 091, WEST BENGAL, INDIA

Specification

FORM-2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10; rule 13)
THREE DIMENSIONAL (3D) MEASUREMENTS USING STRUCTURED LIGHTS (LASER LINE)
TATA CONSULTANCY SERVICES LIMITED,
an Indian Company of Nirmal Building,
9th Floor, Nariman Point, Mumbai - 400 021,
Maharashtra, India.
THE FOLLOWING SPECIFICATION PARTICULARLY DESCRIBES THE INVENTION AND THE MANNER IN WHICH IT IS TO BE PERFORMED

Field of the Invention:
This invention relates to three dimensional (3D) measurements using structured light (a laser line).
Background of Invention
Measurement of objects in a 3 dimensional manner requires visualization and contouring from all possible angles, with adequate lighting, within a visible spectrum. This technique of scanning is useful in effective reconstruction for digital modeling.
Since a 3-dimensional object's profiles may vary in 3 dimensions, a multi-parameter tool may be required, one which is adapted to 'view' and 'record' the profiles of an object to be scanned and also the changes in profiles, in a seamless manner.
Some of the applications of this technology include industrial design, orthotics and prosthetics, reverse engineering and prototyping, quality control/inspection, and documentation of cultural artifacts. Many simple and complex systems have been developed for these purposes.
Objects of the Invention:
An object of this invention is to provide a method and apparatus for the 3D measurements of objects using laser line source and a camera.
Another object of this invention is to provide a method and apparatus for allowing the user to create a laser triangulation using a laser line source, a camera and the target object.

Yet another object of this invention is to provide a method and apparatus for calibrating a plane using laser triangulation without any intrinsic parameters of the camera.
Still another object of this invention is to provide a method for calibrating a plane using laser triangulation setup with a simple triangular object.
An additional object of this invention is to provide a method for measuring the width, height, depth of an object and then generating the 3D cloud points.
Yet an additional object of this invention is to provide a method for generating the front, side and top views of an item.
Still an additional object of this invention is to provide a method for matching the measurement parameters of an object with the corresponding model object.
Summary of Invention:
According to this invention, there is envisaged an apparatus wherein a user is equipped with a laser triangulation setup with a laser line source where the angle of the camera and the FOV (field of view) of the camera are known.
Typically, this apparatus is used for quality control, counting of objects, segregation of objects and various industrial automation related applications.

For the purposes of this specification, an object plane is defined to include a real plane or a pre-defined area in which a test object is placed.
For the purposes of this specification, a laser plane is defined to include an area of the object plane as defined with respect to the laser line source and laser triangulation setup.
For the purposes of this specification, an image plane is defined to include an area of the object plane as defined with respect to the camera setup.
According to this invention, there is provided an apparatus for 3 dimensional measurements of an unknown (test) object adapted to move in an object plane, said apparatus comprises:
- laser triangulation setup with a laser line source to generate a laser plane;
- camera setup with a predetermined angle of camera and predetermined FOV (field of view) settings of the camera adapted to generate an image plane;
- calibration means adapted to perform calibration of said laser triangulation setup using a calibration object of a known size and known measurements in order to define an area bound by X-axis, Y-axis, Z-axis, said calibration means includes:
- video means to record movement of said calibration object;
- X-axis calibration means adapted to calibrate said X-axis along said laser line with respect to said recorded video;

- Y-axis calibration means adapted to calibrate said Y-axis perpendicular to said laser line and parallel to motion of said unknown (test) object with respect to said recorded video;
- Z-axis calibration means adapted to calibrate said Z-axis along height of said unknown (test) object with respect to said recorded video;
- means to move said unknown (test) object along said calibrated Y-axis;
- valid profile extraction means adapted to extract valid profiles of said unknown (test) object;
- height map generation means adapted to generate height map for said laser plane;
- width map generation means to generate width map for said laser plane;
- aligned line generation means adapted to generate an aligned line in vertical direction of said laser plane;
- camera distortion estimation means adapted to estimate camera distortion based on predefined parameters; and
- 3D cloud point generation means adapted to generate 3D cloud points for a test object under said calibrated setup using said generated height map, said generated width map, said aligned line, said calibrated axes, and said estimated camera distortion in order to determine dimensions of said unknown (test) object.
Typically, said calibration means includes height offset determination means adapted to determine height offset of said calibration object in said image plane with respect to said object plane.

Typically, said calibration means includes height determination means adapted to determine height of said unknown (test) object through said camera at predefined intervals of time.
Typically, said calibration means includes width determination means adapted to determine width of said unknown (test) object through said camera at predefined intervals of time.
Typically, said calibration means includes mapping means adapted to map said predetermined height and width with said calibration object in said object plane.
Typically, said height map generation means includes linear height map generation means.
Typically, said width map generation means includes linear width map generation means.
Typically, said height map generation means includes angular height map generation means.
Typically, said width map generation means includes angular width map generation means.
Typically, said height map generation means includes angular height map generation means in pixel units.
Typically, said width map generation means includes angular width map generation means in pixel units.
Typically, said height map generation means includes angular height map generation means in measurement units.

Typically, said width map generation means includes angular width map generation means in measurement units.
Typically, said calibration means includes generation means adapted to generate said height map and said width map for said laser plane for a given triangulation setup by using the inclination angle of said camera with respect to said horizontal plane and said predetermined FOV (field of view) settings of said camera.
Typically, said system includes mapping means adapted to map baseline of said object plane with respect to said image plane and said unknown (test) object in said image plane.
Typically, said system includes mapping means adapted to map image points from image plane of said unknown (test) object with respect to said laser plane.
Typically, said system includes matching means adapted to perform matching of said unknown (test) object with a corresponding calibration model object for quality control for recognizing said unknown (test) object upon passing through said determined object plane defined by said laser plane and image plane.
In particular this invention brings value to any system which needs low cost setup to perform 3D measurements in an industrial automation environment.

The system in accordance with this invention allows generating and storing the 3D model of a test object.
The system in accordance with this invention allows performing counting of objects, segregation of different objects and clustering of similar objects.
Brief Description of the Accompanying Drawings:
The invention will now be described in relation to the accompanying
drawings, in which:
Figure 1 illustrates the laser triangulation setup;
Figure 2 illustrates the viewing window of the camera;
Figure 3 illustrates the calibration object;
Figure 4 illustrates the setup for the laser plane calibration;
Figure 5 illustrates the laser line contour on the image plane for the calibration object;
Figure 6 illustrates laser line contours of the calibration object for different positions;
Figure 7 illustrates the projection of the laser plane on the image plane;
Figure 8 illustrates finding the aligned profile during calibration;
Figure 9 illustrates the pattern placed below the object during testing;
Figure 10 illustrates different views of the test object;
Figure 11 illustrates the projection of the laser plane on the image plane with the camera axis at an angle "β" to the horizontal plane;
Figure 12 illustrates the angular representation of the sphere;
Figure 13 illustrates non-linear maps for low FOV and θ1 as zero - (a) height-map, (b) width-map;
Figure 14 illustrates the non-linear height map for higher FOV - (a) θ1 as zero, (b) θ1 as non-zero;
Figure 15 illustrates the laser line for higher FOV and θ1 as zero; and
Figure 16 illustrates the overall steps for calibration and measurements.
Detailed Description of the Invention:
The various drawings describe the laser triangulation setup and camera view, the laser plane calibration and measurement process, and the visualization of the 3D object and its different views.
In accordance with an embodiment of this invention, there is provided a setup for the laser triangulation. The laser triangulation setup is shown in Figure 1. The world (object) co-ordinate system is given below:
• The X-axis is along the laser line;
• The Y-axis is perpendicular to the laser line and parallel to the motion of the object; and
• The Z-axis is along the height of the object.
In accordance with another embodiment of this invention, there is provided a viewing window of the camera, such that:
The width of the image is Wimg and the height of the image is Himg, as shown in Figure 2. The camera viewing window is arranged such that, without any object, the laser line is captured in the image in the following manner: Hoff is the fixed offset of the laser line, corresponding to height H = 0 in the Z-axis of the object plane. The horizontal direction corresponds to the X-axis of the object plane. Each such image is taken for different points on the Y-axis, as the object moves along this axis.
The co-ordinate systems for the object plane and the image plane may not be aligned. Thus, there are the following 6 external variables:
• Angle between X-axes of object and image;
• Angle between Y-axes of object and image;
• Angle between Z-axes of object and image;
• Shift between X-axes of object and image;
• Shift between Y-axes of object and image; and
• Shift between Z-axes of object and image.
In accordance with yet another embodiment of this invention, there is provided a calibration means. The calibration process is adapted to take care of the above variables and create a mapping among them.
It may be assumed that the X-axis of the laser light arrangement (object plane) and the X-axis of the camera (image plane) are parallel (as seen in Figure 1 of the accompanying drawings).
The steps for overall calibration and measurements are shown in Figure 16.
Laser plane Calibration:
The laser triangulation setup is shown in Figure 1. The laser plane is perpendicular to the plane on which the object moves. The distortion of the laser line due to the contours of the object is viewed by the camera to detect the object profile for a particular position. The distorted laser line always remains on the laser plane. The objective of the calibration of the laser plane is to map the X-Z object co-ordinate system to the image plane.
In order to perform the calibration, a triangular object is taken which may or may not be an equilateral triangle as shown in Figure 3. The placement of the laser light source and the direction of motion of the calibration object are shown in Figure 4.
Steps for the calibration, in accordance with this invention, are as follows:
1. Finding the Hoff in the image plane as shown in Figure 5. This is termed the baseline of the laser light.
2. Finding the max(Z) for all profiles as the calibration object is moved along the X-axis. This gives the point B as (Xb, Zb) as shown in Figure 5.
3. Finding the points A as (Xa, Za) and C as (Xc, Zc).
4. For all points from Xa to Xc, finding the point P as (Xp, Zp).
5. Mapping the points P with reference to the actual calibration object in the world co-ordinate.
6. Generating the height map and the width map for the laser plane for the given triangulation setup by using the inclination angle of the camera with respect to the horizontal plane and the Field-of-View (FOV) of the camera. Spherical distortion of the camera is also estimated.
7. Computing the angular height map and width map in terms of measurement units (millimetres) in the world co-ordinate by merging the linear maps (step 5) and the non-linear maps (step 6).
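The first two calibration steps (finding the baseline and the corners of the triangular contour) can be sketched as follows. This is a minimal sketch: the helper names `laser_profile` and `find_abc` are illustrative, and extracting the laser line as the brightest pixel per column is an assumption, since the specification does not fix a particular line-detection method.

```python
import numpy as np

def laser_profile(mask):
    """For each image column, return the row of the strongest laser pixel.

    `mask` is a 2-D array of laser-line intensities (rows x cols);
    columns with no laser response are set to -1.
    """
    cols = mask.shape[1]
    profile = np.full(cols, -1, dtype=int)
    hit = mask.max(axis=0) > 0
    profile[hit] = mask.argmax(axis=0)[hit]
    return profile

def find_abc(profile, h_off):
    """Locate the triangle corners A, B, C of one calibration profile.

    B is the highest point of the laser contour; A and C are the left-
    and right-most columns where the contour leaves the baseline.
    Image rows grow downward, so 'higher' means a smaller row index.
    """
    lifted = np.where((profile >= 0) & (profile < h_off))[0]
    if lifted.size == 0:
        return None          # no object under the laser in this frame
    xa, xc = lifted[0], lifted[-1]
    xb = lifted[np.argmin(profile[lifted])]
    return (xa, profile[xa]), (xb, profile[xb]), (xc, profile[xc])
```

Repeating `find_abc` on every frame of the recorded video yields the per-position profiles used in the remaining steps.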

Figure 6 shows the overlapped profiles of the calibration object on the image plane for different positions. The region-1 is dependent on Hc, which can be adjusted to increase the Hmax.
In accordance with still another embodiment of this invention, there is provided a method and system for Generation of height map and width map. The height map is termed as mapping the Z-axis co-ordinates (in terms of measured distance, e.g. millimeter) of the object to that of the image co-ordinate (pixels). This can also be termed as mapping the height of the laser plane from the baseline on the image plane. Following are the steps followed for the mapping:
1. Finding the baseline and mapping its height as 0.
2. For each profile, finding A, B and C; note that not all profiles have all three points.
3. For profiles having the A and B points, mapping the image points on the laser line to the actual height of the object along the left side of the triangle.
4. For profiles having the B and C points, mapping the image points on the laser line to the actual height of the object along the right side of the triangle.
5. Performing correction of the height mapping due to angular projection.
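Steps 1-4 can be sketched as below (the angular correction of step 5 is treated separately). The data layout is an assumption: each profile is an array `profile[column] = laser row`, and the function name is illustrative.

```python
def linear_height_map(profile, h_off, A, B, C, Hc):
    """Associate each observed laser row with a physical height (mm).

    Along the left side (columns A..B) the calibration triangle's height
    rises linearly from 0 to Hc; along the right side (B..C) it falls
    back to 0. The baseline row h_off maps to height 0.
    """
    xa, xb, xc = A[0], B[0], C[0]
    hmap = {h_off: 0.0}
    for xp in range(xa, xc + 1):
        zp = profile[xp]
        if xp <= xb:
            hmap[zp] = Hc * (xp - xa) / (xb - xa)   # left side of triangle
        else:
            hmap[zp] = Hc * (xc - xp) / (xc - xb)   # right side of triangle
    return hmap
```

Because both sides of the triangle are straight, each image row between the baseline and the apex receives a unique height value.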
The width map is termed as mapping the X-axis co-ordinates (in terms of measured distance, e.g. millimeter) of the object to that of the image co-ordinate (pixels). This can also be termed as mapping the width of the laser plane onto the image plane. Following are the steps followed for the mapping:

1. Finding the baseline.
2. Finding the aligned profile of the calibration object. The (Xb, Z) line is the reference for the width computation. As the calibration object is moved along the laser line, the shape of the triangle changes on the image plane as shown in Figure 8. The error (E) due to the distortion is computed as:
   E = (Lcright / Lcleft) - (Xc - Xb) / (Xb - Xa)
   The profile which has the minimum error (Emin) due to the projection in the X-axis is termed the aligned profile.
3. Computing the distance of all the points on X from Xb of the aligned profile.
4. Performing correction of the width mapping due to angular projection.
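The error measure and the selection of the aligned profile can be expressed directly; the function names are illustrative.

```python
def alignment_error(A, B, C, Lc_left, Lc_right):
    """E = (Lc_right / Lc_left) - (Xc - Xb) / (Xb - Xa), per step 2."""
    xa, xb, xc = A[0], B[0], C[0]
    return Lc_right / Lc_left - (xc - xb) / (xb - xa)

def aligned_profile(corner_sets, Lc_left, Lc_right):
    """Return the (A, B, C) corner set with minimum |E| over all
    recorded positions of the calibration object."""
    return min(corner_sets,
               key=lambda abc: abs(alignment_error(*abc, Lc_left, Lc_right)))
```

A profile whose apex divides the image segment in the same ratio as the real triangle's Lc_right/Lc_left gives E = 0, so it is chosen as the aligned profile.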
The calibration object as shown in Figure 3 has a length of Lc and a height of Hc. These measurements are in millimetres or any other scale. Thus, at a distance of Lc1 from the left-most point the height is Hc1, and at a distance of Lc2 from the right-most point the height is Hc2:
Hc1 = Hc * Lc1 / Lcleft
Hc2 = Hc * Lc2 / Lcright
The total length is Lc = Lcleft + Lcright.
Now consider any point P (Xp, Zp) on the profile (as seen in Figure 5 of the accompanying drawings.)
Generation of linear height-map and width-map for Xa <= Xp <= Xb
From Figure 3 and Figure 5 of the accompanying drawings, Hc1 = Hc * Lc1 / Lcleft, where:
Hc corresponds to Zb - Hoff
Hc1 corresponds to Zp - Hoff
Lc1 corresponds to Xp - Xa
Lcleft corresponds to Xb - Xa
This means that the average X-axis pixel interval (in millimetres) between Xb and Xa is Lcleft / (Xb - Xa).
The point (Xb, Zb) corresponds to (Lcleft, Hc).
Thus, the world point for (Xp, Zp) corresponds to (Lc1, Hc1), where:
Lc1 = (Xp - Xa) * (Lcleft / (Xb - Xa))
Hc1 = (Xp - Xa) * (Hc / (Xb - Xa))
Thus, Hc1 is the linear height mapping for the point P (Xp, Zp), and Lc1 is the linear width mapping for the point P (Xp, Zp).
Generation of linear height-map and width-map for Xb < Xp <= Xc
From Figure 3 and Figure 5 of the accompanying drawings, it can be seen that Hc2 = Hc * Lc2 / Lcright, where:
Hc corresponds to Zb - Hoff
Hc2 corresponds to Zp - Hoff
Lc2 corresponds to Xc - Xp
Lcright corresponds to Xc - Xb
This means that the average X-axis pixel interval (in millimetres) between Xb and Xc is Lcright / (Xc - Xb).
Thus, the world point for (Xp, Zp) corresponds to (Lc - Lc2, Hc2), where:
Lc2 = (Xc - Xp) * (Lcright / (Xc - Xb))
Hc2 = (Xc - Xp) * (Hc / (Xc - Xb))
Thus, Hc2 is the linear height mapping for the point P (Xp, Zp), and Lc - Lc2 is the linear width mapping for the point P (Xp, Zp).
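The two linear-map cases above can be combined into a single lookup; the helper name is illustrative.

```python
def world_point(xp, xa, xb, xc, Lc_left, Lc_right, Hc):
    """Linear world co-ordinates (width, height) in mm for a laser-line
    pixel at column xp, using the left- and right-side formulas above.
    On the right side the width co-ordinate is Lc - Lc2."""
    if xa <= xp <= xb:
        lc1 = (xp - xa) * Lc_left / (xb - xa)
        hc1 = (xp - xa) * Hc / (xb - xa)
        return lc1, hc1
    if xb < xp <= xc:
        lc2 = (xc - xp) * Lc_right / (xc - xb)
        hc2 = (xc - xp) * Hc / (xc - xb)
        return (Lc_left + Lc_right) - lc2, hc2
    raise ValueError("column outside the calibration contour")
```

For example, with Xa = 20, Xb = 50, Xc = 80, Lcleft = Lcright = 60 mm and Hc = 30 mm, the column Xp = 35 (halfway up the left side) maps to the world point (30 mm, 15 mm).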
Angular map generation of width and height in pixel units
In accordance with an additional embodiment of this invention, Angular correction of height-map and width-map is calculated.
The angular correction is done based on the projection of the laser plane onto the image plane (sphere plane), as depicted in Figure 7. Consider the laser plane as the tangential plane on the spherical surface equidistant from the camera. Then consider a circle on the sphere which touches the laser plane and is perpendicular to it. This circle is shown in the figure as a dotted line and is termed the "Equidistant Points from Camera" (EPC). This circle touches the laser plane at the point "O". Now consider a point "M" on the intersection line of the laser plane and the circle plane (EPC). The line connecting the camera and the object point "M" intersects the EPC at "N". Thus, the projection of the laser-plane point "M" onto the image plane is the point "N". The sphere containing the EPC forms the image plane. OM is the length along the laser plane; ON is the length along the circle. Now consider that the camera is tilted at an angle "β" from the horizontal axis, as shown in Figure 11. The laser plane is perpendicular to the horizontal plane. The principal axis of the camera cuts the laser plane at O'. Any other line, at an angle "θ" from the principal axis, cuts the laser plane at M'.

The above description is provided as a cross-sectional view of a sphere. Consider a sphere of radius ρ whose centre is at the camera. The laser plane is tangent to the sphere at the point (x, y, z), as shown in Figure 12. The camera principal axis is along the X-axis. If the aligned line is at an angle φ with the x-axis, then the equation of the tangent plane passing through the point where the radius cuts the aligned line is

where ρ is the radius of the sphere.
Now, at an arbitrary combination of (θ1, φ), the Cartesian co-ordinate of the point on the sphere is

where β is the camera angle with the X-axis on the X-Z plane.
The point (x', y', z') on the tangent plane, where the ray joining (0, 0, 0) and (x, y, z) cuts the tangent plane, is

X = difference between the principal point and the column in the aligned line

Now, consider two points on the sphere, S1 and S2, at (θ11, φ1) and (θ12, φ2). If two rays are drawn from the origin through S1 and S2, they intersect the laser plane at L1 and L2 respectively:
L1 = (x1, y1, z1)
L2 = (x2, y2, z2)
Thus, the distances between two points on the laser plane and the corresponding distances on the sphere plane (image plane) are used to generate the mapping.
For the height map correction, the base point on the sphere S1h is taken at (θ11, π/2) and an arbitrary point S2h is taken at (θ11, θ12). The corresponding points on the laser plane are L1h and L2h.
Thus, the non-linear height map at P (Xp, Zp) is

(Hc1 * (L1h - L2h)) / (ρ * (θ12 - π/2))

Thus, by varying θ12 and θ11, the angular correction for the height-map can be performed.
For the width map correction, the base point on the sphere S1w is taken at (θ, θ11) and an arbitrary point S2w is taken at (θ12, θ11). The corresponding points on the laser plane are L1w and L2w.
Thus, the non-linear width map at P (Xp, Zp) is

(Lc1 * (L1w - L2w)) / (ρ * (θ12 - θ))

Thus, by varying θ11 and θ12, the angular correction for the width-map can be performed.
Assuming that the image is COL x ROW pixels, the per-pixel angle span is FOV/COL in the horizontal and FOV/ROW in the vertical respectively. Thus, varying θ1 and θ2, the non-linear height map (NLHMP) and width-map (NLWMP) are generated from the following equations:
For (height = 0; height < ROW; height++)
    Calculate θ1 and θ2; Calculate OM'; Calculate ON
For (width = 0; width < COL; width++)
    Calculate θ1 and θ2; Calculate OM'; Calculate ON
Sample non-linear height and width maps are shown in Figure 13. In the case of the height-map, the horizontal thick black patch in the centre, at row height/2, is the reference zero line. In the case of the width-map, the vertical thick black patch at column width/2 is the reference zero line. As one moves away from the reference line, the distance increases. The image is separated into equidistant stripes, where the change in the distance is shown by shades within a stripe. The equidistant lines on the image are very close to straight lines for a very low FOV and a low-distortion camera. These lines tend to become non-linear as the FOV increases, as shown in Figure 14 (a) for zero θ1 and in Figure 14 (b) for non-zero θ1. The non-linearity also depends on the properties of the camera.
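The per-pixel-angle loops can be sketched as below. Note the geometric form used here is an assumption (the original equations are drawing images not reproduced in the text): a ray at angle t from the principal axis is taken to meet the tangent laser plane at distance ρ·tan(t) from the tangent point (OM'), while the corresponding arc on the image sphere has length ρ·t (ON); the function name is illustrative.

```python
import math

def nonlinear_maps(rows, cols, fov_v, fov_h, rho=1.0):
    """Sketch of NLHMP / NLWMP generation in pixel units.

    The reference zero lines sit at row/2 and column/2, matching the
    thick black patches of Figure 13; distances grow non-linearly
    (like tan) away from the reference line, as in Figure 14.
    """
    d_v = fov_v / rows            # per-pixel angle, vertical
    d_h = fov_h / cols            # per-pixel angle, horizontal
    nlhmp = [rho * math.tan((r - rows / 2) * d_v) for r in range(rows)]
    nlwmp = [rho * math.tan((c - cols / 2) * d_h) for c in range(cols)]
    return nlhmp, nlwmp
```

For a small FOV, tan(t) ≈ t and the map is nearly linear, which matches the near-straight equidistant lines of Figure 13; at higher FOV the tan term bends them, as in Figure 14.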
In accordance with an embodiment of this invention, there is provided camera distortion estimation means for estimation of camera distortion.
In the presence of spherical distortion at the edges, the laser baseline will not be a straight line; instead it will take the shape of an arc, as shown in Figure 15. If the FOV is very small (5-8 degrees), this distortion is not visible in the case of a normal camera. In high-end cameras the spherical distortion starts appearing at a higher FOV (above 50 degrees). Thus, the structure of the baseline can be similar for a high-end camera with a high FOV and a low-end camera with a small FOV. Considering this observation, a camera distortion parameter (Cd) is added to the computation of θ1 and θ2. Thus, the modified per-pixel angle is Cd * per-pixel angle. Cd is estimated by performing a best fit of the observed laser baseline (Figure 15) with the various angular height-maps. The height-map that gives the least error for the baseline pixels is considered to be the final height-map, and the corresponding distortion parameter is Cdmin. The width-map is computed using Cdmin. In the case of a perfect laser-triangulation setup, θ1 is zero; however, the imperfection can be compensated during the computation of Cdmin by searching for different θ1 within a small range (5-10 degrees).
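The Cd search can be sketched as a best-fit loop. `height_map_for` is a caller-supplied, hypothetical map builder (the specification does not name one), and scoring the baseline by its deviation from a constant height is one reasonable reading of "least error for the baseline pixels".

```python
def estimate_cd(baseline_rows, candidate_cds, height_map_for):
    """Pick Cdmin: the distortion parameter whose angular height map
    makes the observed laser baseline flattest (least squared error)."""
    best_cd, best_err = None, float("inf")
    for cd in candidate_cds:
        hmap = height_map_for(cd)          # angular height map for this Cd
        vals = [hmap[r] for r in baseline_rows]
        mean = sum(vals) / len(vals)
        err = sum((v - mean) ** 2 for v in vals)
        if err < best_err:
            best_cd, best_err = cd, err
    return best_cd
```

The same loop can be nested over a small range of θ1 values to compensate for an imperfect triangulation setup, as described above.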
There is provided means for angular map generation of width and height in measurement units.
Once Cdmin is found, the angular width and height can be computed in the pixel domain as discussed before. The zero reference line of the height-map is then shifted from height/2 to the laser baseline. Now, consider the column Xbaligned corresponding to the point B in the image for the aligned profile of the calibration object (Figure 8). Compute the total number of non-linear pixel points in the height map for the rows between (Xbaligned, Zbaligned) and (Xbaligned, Zbaseline) in the column Xbaligned. This corresponds to the total height of the calibration object in the linear height map. Thus, the non-linear per-pixel value (NLPPV) is computed as below:

NLPPV = Hc / (total non-linear pixel value between the baseline and B in the column Xbaligned)

The non-linear maps for height and width in measurement units (e.g. millimetre) are derived using the following equations:
NLHMM = NLHMP * NLPPV
NLWMM = NLWMP * NLPPV
In accordance with yet an additional embodiment of this invention, Testing and Measurements is done as follows:
Once the width and height maps are generated for the triangulation setup, there exists a mapping of each pixel on the image plane to the millimetre scale on the object plane. The test object is moved in the direction shown in Figure 1. The laser profile at each position gives cross-sectional information about the object. Successive profiles added together give the 3D cloud points of the object. These 3D cloud points are used to perform measurements using the width and the height maps, where the distance between two successive profiles is derived from the speed of the motion of the object.
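The stacking of successive profiles into cloud points can be sketched as follows; the dict-based lookup maps and the function name are illustrative.

```python
def cloud_points(profiles, width_map, height_map, dy):
    """Merge successive laser profiles into 3D cloud points (mm).

    Each profile is a list of (column, row) laser pixels; width_map and
    height_map are the calibrated pixel-to-mm lookups, and dy is the
    object travel between two profiles, derived from its speed.
    """
    points = []
    for i, profile in enumerate(profiles):
        y = i * dy                       # along the direction of motion
        for col, row in profile:
            points.append((width_map[col], y, height_map[row]))
    return points
```

Width and height come from the calibrated maps, while the Y co-ordinate is simply the profile index times the per-profile travel.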
The speed of the object can be found from the speed of the conveyor belt on which it is mounted. However, this requires calibration of the speed of the conveyor belt, which may change over time due to natural reasons. Thus, an image processing technique is used to derive the average speed of the object. A known pattern, as shown in Figure 9, is placed under the object under test. The motion of the alternate black and white stripes is used to derive the average speed of the object. The speed is used to derive the distance between two profiles in millimetres or a similar unit.
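The stripe-based speed estimate can be sketched as: detect a black-to-white edge of the pattern in each frame, convert its position to millimetres with the width map, and average the displacement over the frames. Names and the binarisation threshold are illustrative assumptions.

```python
def stripe_edge(row_pixels, threshold=128):
    """Column of the first black-to-white transition in one image row
    of the stripe pattern (Figure 9), or None if there is none."""
    for x in range(1, len(row_pixels)):
        if row_pixels[x - 1] < threshold <= row_pixels[x]:
            return x
    return None

def profile_spacing_mm(edge_positions_mm, frame_dt):
    """Distance between two successive profiles: the average stripe-edge
    speed multiplied by the frame interval frame_dt (seconds)."""
    steps = len(edge_positions_mm) - 1
    speed = (edge_positions_mm[-1] - edge_positions_mm[0]) / (steps * frame_dt)
    return speed * frame_dt
```

Averaging over many frames makes the estimate robust to frame-to-frame jitter in the edge detection.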
Visualization
The visualization of the 3D object is performed using a color map to represent the height of the object. Different views are shown in Figure 10. The front, side and top views are the three engineering views derived from the 3D cloud points.
Front View
The front view is generated by placing one profile behind another and overwriting the color map. This is also used to derive the width and the height of the object.

Side View
The side view is generated by rotating the profiles by 90° on the left and generating the color map. This is used to derive the depth of the object.
Top View
The top view is generated by rotating the profiles by 90° on the front and placing the one profile after another generating the color map. This is also used to derive the width of the object.
3D View
This is generated from the 3D cloud points. Rotation is employed to generate the view from any desired direction. As a single camera is used, there are occluded points whose 3D information is not available. The occluded points can be filled in by using two or more cameras. In this case, the laser plane for each camera needs to be calibrated separately.
The 3D cloud points can be used for applications like measurements, quality control, object counting, object segregation, object clustering etc.
While considerable emphasis has been placed herein on the particular features of the preferred embodiment and the improvisation with regards to it, it will be appreciated that various modifications can be made in the preferred embodiment without departing from the principles of the invention. These and other modifications in the nature of the invention will be apparent to those skilled in the art from the disclosure herein, whereby it is to be distinctly understood that the foregoing descriptive matter is to be interpreted merely as illustrative of the invention and not as a limitation.

We Claim:
1. An apparatus for 3 dimensional measurements of an unknown (test) object adapted to move in an object plane, said apparatus comprising:
- laser triangulation setup with a laser line source to generate a laser plane;
- camera setup with a predetermined angle of camera and predetermined FOV (field of view) settings of the camera adapted to generate an image plane;
- calibration means adapted to perform calibration of said laser triangulation setup using a calibration object of a known size and known measurements in order to define an area bound by X-axis, Y-axis, Z-axis, said calibration means includes:
- video means to record movement of said calibration object;
- X-axis calibration means adapted to calibrate said X-axis along said laser line with respect to said recorded video;
- Y-axis calibration means adapted to calibrate said Y-axis perpendicular to said laser line and parallel to motion of said unknown (test) object with respect to said recorded video;
- Z-axis calibration means adapted to calibrate said Z-axis along height of said unknown (test) object with respect to said recorded video;
- means to move said unknown (test) object along said calibrated Y-axis;
- valid profile extraction means adapted to extract valid profiles of said unknown (test) object;

- height map generation means adapted to generate height map for said laser plane;
- width map generation means to generate width map for said laser plane;
- aligned line generation means adapted to generate an aligned line in vertical direction of said laser plane;
- camera distortion estimation means adapted to estimate camera distortion based on predefined parameters; and
- 3D cloud point generation means adapted to generate 3D cloud points for a test object under said calibrated setup using said generated height map, said generated width map, said aligned line, said calibrated axes, and said estimated camera distortion in order to determine dimensions of said unknown (test) object.

2. A system as claimed in claim 1 wherein, said calibration means includes height offset determination means adapted to determine height offset of said calibration object in said image plane with respect to said object plane.
3. A system as claimed in claim 1 wherein, said calibration means includes height determination means adapted to determine height of said unknown (test) object through said camera at predefined intervals of time.

4. A system as claimed in claim 1 wherein, said calibration means includes width determination means adapted to determine width of said unknown (test) object through said camera at predefined intervals of time.
5. A system as claimed in claim 1 wherein, said calibration means includes mapping means adapted to map said predetermined height and width with said calibration object in said object plane.
6. A system as claimed in claim 1 wherein, said height map generation means includes linear height map generation means.
7. A system as claimed in claim 1 wherein, said width map generation means includes linear width map generation means.
8. A system as claimed in claim 1 wherein, said height map generation means includes angular height map generation means.
9. A system as claimed in claim 1 wherein, said width map generation means includes angular width map generation means.
10. A system as claimed in claim 1 wherein, said height map generation means includes angular height map generation means in pixel units.

11. A system as claimed in claim 1 wherein, said width map generation means includes angular width map generation means in pixel units.
12. A system as claimed in claim 1 wherein, said height map generation means includes angular height map generation means in measurement units.
13. A system as claimed in claim 1 wherein, said width map generation means includes angular width map generation means in measurement units.
14. A system as claimed in claim 1 wherein, said calibration means includes generation means adapted to generate said height map and said width map for said laser plane for a given triangulation setup by using the inclination angle of said camera with respect to the horizontal plane and said predetermined FOV (field of view) settings of said camera.
15. A system as claimed in claim 1 wherein, said system includes mapping means adapted to map baseline of said object plane with respect to said image plane and said unknown (test) object in said image plane.
16. A system as claimed in claim 1 wherein, said system includes mapping means adapted to map image points from image plane of said unknown (test) object with respect to said laser plane.

17. A system as claimed in claim 1 wherein, said system includes matching means adapted to perform matching of said unknown (test) object with a corresponding calibration model object for quality control for recognizing said unknown (test) object upon passing through said determined object plane defined by said laser plane and image plane.
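The matching step of claim 17 — comparing a measured test object against a calibration model object for quality control — could, as a minimal sketch with hypothetical dimension names and a hypothetical tolerance, look like:

```python
def matches_model(measured, model, tol_mm=0.5):
    """Return True if each measured dimension (length, width, height), in mm,
    is within tol_mm of the reference model object. The tuple layout and the
    0.5 mm tolerance are illustrative assumptions, not taken from the claims."""
    return all(abs(m - r) <= tol_mm for m, r in zip(measured, model))
```

A production matcher would likely compare full 3D point clouds or profiles rather than three scalar dimensions, but the pass/fail decision reduces to a tolerance check of this shape.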

Application Documents

# Name Date
1 2527-MUM-2009-CORRESPONDENCE(5-2-2010).pdf 2018-08-10
2 2527-MUM-2009-FORM 5(27-10-2010).pdf 2010-10-27
3 2527-MUM-2009-FORM 2(TITLE PAGE)-(27-10-2010).pdf 2010-10-27
4 2527-MUM-2009-CORRESPONDENCE(IPO)-(DECISION)-(31-7-2017).pdf 2018-08-10
5 2527-mum-2009-form 2(27-10-2010).pdf 2010-10-27
6 2527-MUM-2009-CORRESPONDENCE(IPO)-(HEARING NOTICE)-(4-5-2017).pdf 2018-08-10
7 2527-mum-2009-correspondence.pdf 2018-08-10
8 2527-MUM-2009-DRAWING(27-10-2010).pdf 2010-10-27
9 2527-mum-2009-description(provisional).pdf 2018-08-10
10 2527-mum-2009-drawing.pdf 2018-08-10
11 2527-MUM-2009-DESCRIPTION(COMPLETE)-(27-10-2010).pdf 2010-10-27
12 2527-MUM-2009-FORM 1(5-2-2010).pdf 2018-08-10
13 2527-MUM-2009-CORRESPONDENCE(27-10-2010).pdf 2010-10-27
14 2527-mum-2009-form 1.pdf 2018-08-10
15 2527-MUM-2009-CLAIMS(27-10-2010).pdf 2010-10-27
16 2527-mum-2009-form 2(title page).pdf 2018-08-10
17 2527-MUM-2009-ABSTRACT(27-10-2010).pdf 2010-10-27
18 2527-mum-2009-form 2.pdf 2018-08-10
19 2527-mum-2009-form 26.pdf 2018-08-10
20 2527-MUM-2009-FORM 18(30-11-2010).pdf 2010-11-30
21 2527-mum-2009-form 3.pdf 2018-08-10
22 2527-MUM-2009-CORRESPONDENCE(30-11-2010).pdf 2010-11-30
23 2527-MUM-2009_EXAMREPORT.pdf 2018-08-10
24 abstract1.jpg 2018-08-10
25 Examination Report Reply Recieved [07-07-2016(online)].pdf 2016-07-07
26 2527-MUM-2009-ORIGINAL UNDER RULE 6 (1A)-28-06-2017.pdf 2017-06-28
27 Description(Complete) [07-07-2016(online)].pdf 2016-07-07
28 Written submissions and relevant documents [19-06-2017(online)].pdf 2017-06-19
29 Correspondence [07-07-2016(online)].pdf 2016-07-07
30 Claims [07-07-2016(online)].pdf 2016-07-07
31 Abstract [07-07-2016(online)].pdf 2016-07-07