
System And Method For Stereo Object Detection And Distance Computation

Abstract: The present disclosure provides a method and system for determining a distance between a vehicle and an object appearing in a path of the vehicle. A stereo camera coupled with the vehicle may capture a left image and a right image of an object in the path of the vehicle. The left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. Further, a semi-global matching technique may be applied on the first-set and the second-set of pixels by matching a subset of the first-set of pixels with a subset of the second-set of pixels, wherein the subset of the first-set of pixels corresponds to predefined rows and predefined columns in a first matrix. Based on the matching, a disparity map comprising disparity values is generated. Further, the system computes, using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.


Patent Information

Application #
Filing Date
16 April 2014
Publication Number
47/2015
Publication Type
INA
Invention Field
COMMUNICATION
Status
Parent Application
Patent Number
Legal Status
Grant Date
2022-02-14
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

Inventors

1. C R, Manoj
Tata Consultancy Services Limited, Salarpuria G. R. Tech Park, JAL Block, Mahadevapura, K R Puram, Bangalore - 560066, Karnataka, India
2. PATIL, Prabhudev
Tata Consultancy Services Limited, Salarpuria G. R. Tech Park, JAL Block, Mahadevapura, K R Puram, Bangalore - 560066, Karnataka, India

Specification

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
SYSTEM AND METHOD FOR STEREO OBJECT DETECTION AND DISTANCE COMPUTATION

Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under The Companies Act, 1956
Having address:
Nirmal Building, 9th Floor,
Nariman Point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application claims priority to Indian Provisional Patent Application No. 1347/MUM/2014, filed on April 16, 2014, the entirety of which is hereby incorporated by reference.
TECHNICAL FIELD
[002] The present disclosure, in general, relates to automotive driver assistance and safety systems, and more particularly relates to a system and method for detecting objects within short to medium ranges and computing distances of the detected objects using a stereo camera.
BACKGROUND
[003] Statistical analysis of road accidents and corresponding casualties worldwide highlights the fact that road safety and driver assistance functionalities are of critical significance in the automobile design and manufacturing industry. One of the important factors responsible for road accidents is either drivers’ inattention or reduced-range vision on the roads. Conventional safety features like seat belts, airbags, and antilock braking systems (ABS), which are common features in automobiles today, help the driver reduce the severity of accidents. However, Advanced Driver Assistance Systems (ADAS) features are the ones that actively help the driver avoid accidents by giving early alerts to the driver and, if required, taking over control of the car from the driver.
[004] For example, in the case of collision avoidance systems (CAS), when an obstacle is detected in the path of the vehicle, automatic brakes are applied with the help of an onboard camera and image processing algorithms executed by an onboard microprocessor. For the CAS to be effective, it needs to be accurate in detecting objects at close distances of up to around 5 m. At the same time, the CAS needs to be computationally efficient so that it is able to provide a real-time response to the driver.
SUMMARY
[005] In one implementation, a system for determining a distance between a vehicle and an object appearing in a path of the vehicle is disclosed. The system comprises a processor and a memory coupled to the processor. The processor executes a plurality of modules stored in the memory. The plurality of modules comprises a receiving module, a semi-global matching module, and a distance computing module. The receiving module may receive a left image and a right image comprising an object appearing in a path of a vehicle. The left image and the right image may be captured by a stereo camera coupled with the vehicle. Further, the left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. Further, the first-set of pixels and the second-set of pixels may be separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns. Further, the semi-global matching module may apply a semi-global matching technique on the left image and the right image by matching a subset of the first-set of pixels with a subset of the second-set of pixels. The subset of the first-set of pixels may correspond to predefined rows and predefined columns in the first matrix. Further, the semi-global matching module may generate a disparity map based on the matching. The disparity map generated may comprise disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels. Further, the distance computing module may compute, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.
[006] In another implementation, a method for determining a distance between a vehicle and an object appearing in a path of the vehicle is disclosed. The method may comprise receiving, by a processor, a left image and a right image comprising an object appearing in a path of a vehicle. The left image and the right image may be captured by a stereo camera coupled with the vehicle. Further, the left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. Further, the first-set of pixels and the second-set of pixels may be separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix may comprise rows and columns. The method may further comprise a step of applying, by the processor, a semi-global matching technique on the left image and the right image by matching a subset of the first-set of pixels with a subset of the second-set of pixels. The subset of the first-set of pixels may correspond to predefined rows and predefined columns in the first matrix. Further, the method may comprise a step of generating a disparity map based on the matching. The disparity map generated may comprise disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels. The method may further comprise a step of computing, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.
[007] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for determining a distance between a vehicle and an object appearing in a path of the vehicle is disclosed. The program may comprise a program code for receiving a left image and a right image comprising an object appearing in a path of a vehicle. The left image and the right image may be captured by a stereo camera coupled with the vehicle. Further, the left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. Further, the first-set of pixels and the second-set of pixels may be separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns. The program may further comprise a program code for applying a semi-global matching technique on the left image and the right image by matching a subset of the first-set of pixels with a subset of the second-set of pixels. The subset of the first-set of pixels may correspond to predefined rows and predefined columns in the first matrix. Further, the program may comprise a program code for generating a disparity map based on the matching. The disparity map generated may comprise disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels. The program may further comprise a program code for computing, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[009] Figure 1 illustrates a network implementation of a system for determining a distance between a vehicle and an object, in accordance with an embodiment of the present disclosure.
[0010] Figure 2 illustrates the system, in accordance with an embodiment of the present disclosure.
[0011] Figures 3A-3B illustrate a detailed explanation of the system, in accordance with an embodiment of the present disclosure.
[0012] Figure 4 illustrates the definition and computation of dissimilarity, in accordance with an embodiment of the present disclosure.
[0013] Figure 5 illustrates aggregation of costs in disparity space, in accordance with an embodiment of the present disclosure.
[0014] Figure 6 illustrates a method for determining a distance between a vehicle and an object, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0015] Systems and methods for determining a distance between a vehicle and an object appearing in a path of the vehicle are described. Different objects, like another vehicle, an animal, a pedestrian, or any other object, may come in front of the vehicle while driving on a road. Detecting these objects and locating their position relative to the vehicle is a prime concern in road safety. Detecting the position of these objects facilitates assisting the driver of the vehicle, or any other authority, in preventing or avoiding any chance of an accident. According to embodiments of the present disclosure, a stereo camera coupled with the vehicle may capture a left image and a right image of one or more objects appearing on the path of the vehicle. In a first step, the system may receive the left image and the right image captured by the stereo camera. The left image and the right image may comprise an object appearing in the path of the vehicle. The left image and the right image may be normalized and further enhanced to improve the quality of the images under various scenarios, like changing illumination conditions and terrains. The left image and the right image, after being enhanced, may then be interpolated to enable sub-pixel level processing of the images. The level of interpolation may be pre-defined based on the parameters of the stereo camera and the targeted detection distance of the object.
[0016] Subsequently, a semi-global matching approach may be used at a sub-pixel level of the left and the right image to find the best matching pixels between the left image and the right image. According to embodiments of the present disclosure, the semi-global matching approach may combine the efficiency of both local and global methods of pixel matching, gives optimum runtime performance to the system, and may further be implemented in low cost hardware as well. Based on the matching, a disparity map may be generated. The disparity map generated may comprise disparity values obtained based on the matching of the pixels of the left and the right image. Further, the system may compute, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.
[0017] While aspects of described device and method for determining a distance between a vehicle and an object appearing in a path of the vehicle may be implemented in any number of different computing devices, environments, and/or configurations, the embodiments are described in the context of the following exemplary devices.
[0018] Referring to Figure 1, a network implementation 100 of a system 102 for determining a distance between a vehicle 108 and an object appearing in a path of the vehicle 108 is illustrated, in accordance with an embodiment of the present disclosure. In one embodiment, the system 102 facilitates computing the distance of the objects appearing on the path of the vehicle 108. Although the present subject matter is explained considering that the system 102 is implemented as a software application on a server, it may be understood that the system 102 may also be implemented as a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, a tablet, a mobile phone, and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. According to embodiments, a stereo camera 104 may be coupled with the vehicle 108. Further, the stereo camera 104 may be communicatively coupled to the system 102 through a network 106.
[0019] In one implementation, the network 106 may be a wireless network, a wired network, or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0020] Referring now to Figure 2, the system 102 is illustrated in accordance with an embodiment of the present disclosure. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions or modules stored in the memory 206.
[0021] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the user devices. Further, the I/O interface 204 may enable the system 102 to communicate with stereo camera 104 coupled with the vehicle 108, and other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0022] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, compact discs (CDs), digital versatile discs or digital video discs (DVDs), and magnetic tapes. The memory 206 may include modules 208 and data 222.
[0023] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include a receiving module 210, a Sobel filter 212, an interpolation module 214, a semi-global matching module 216, a distance computation module 218, and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the system 102.
[0024] The data 222, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 222 may also include a data store 224 and other data 226. The other data 226 may include data generated as a result of the execution of one or more modules in the other modules 220.
[0025] Referring now to Figures 3A-3B, the detailed working of the system 102 is illustrated, in accordance with an embodiment of the present disclosure. The system 102 is configured for determining a distance between a vehicle 108 and an object appearing in a path of the vehicle 108. According to embodiments, a stereo camera 104 may be coupled with the vehicle 108. The stereo camera 104 may be configured for capturing a left image and a right image of the objects appearing in the path of the vehicle 108. In the first step, the receiving module 210 of the system 102 may receive the left image and the right image from the stereo camera 104. The left image and the right image may comprise an object appearing in the path of the vehicle 108. Further, the left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. Further, the first-set of pixels and the second-set of pixels are separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix may comprise rows and columns. Further, the Sobel filter 212 of the system 102 may be applied on the left image and the right image in order to detect edges in the left image and the right image.
[0026] Further, the interpolation module 214 may improve the efficiency of the pixel matching, whereby the enhanced left image and right image may be interpolated to enable sub-pixel level processing. The level of interpolation may be decided based on the detection distance (i.e., a target distance of the object) and the parameters of the stereo camera.
[0027] Further, the semi-global matching module 216 of the system 102 may be configured to apply a semi-global matching technique on the left image and the right image. The conventionally available semi-global matching technique is computationally complex and too heavy to be implemented in low cost embedded systems. Thus, to reduce the computational complexity in embedded systems, the present disclosure provides a modification of the conventional semi-global matching technique. The modification leads to reduced execution time or run-time on the embedded systems. According to embodiments of the present disclosure, the semi-global matching module 216 of the system 102 applies the semi-global matching technique by matching a subset of the first-set of pixels with a subset of the second-set of pixels. The subset of the first-set of pixels may correspond to predefined rows and predefined columns in the first matrix. Thus, in the present disclosure, the semi-global matching module 216 chooses only a few pixels from selected rows and columns of the first matrix (corresponding to the first-set of pixels) for the matching, instead of selecting pixels from all rows and columns. Based on the matching, a disparity map may be generated. Further, the disparity map generated may comprise disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels. Since only selected rows and columns were considered while matching the pixels, it must be understood that the remaining set of rows and columns (other than the selected rows and columns) of the disparity map may be left un-computed (i.e., without having any disparity values). For this, the semi-global matching module 216 may fill the un-computed columns with a saturation cost aggregated value, which is in turn used to compute the final disparity map. For the un-computed rows, the final disparity values may be filled with the previous row’s data. With this approach, the quality of the disparity map, and thus the distance calculation to the objects, remains largely unaffected, while at the same time the processing time of the disparity computation improves drastically, enabling real-time implementation on low cost hardware.
[0028] In one example of the present disclosure, the subset of the first-set of pixels may be matched in such a manner that each pixel of the first-set of pixels present at an interval of every 2nd row and 3rd column (i.e., the selected rows and columns) of the first matrix is matched with its corresponding pixels in the second-set of pixels. In another example, the subset of the first-set of pixels may be matched in such a manner that each pixel of the first-set of pixels present at an interval of every 3rd row and 4th column of the first matrix is matched with its corresponding pixels in the second-set of pixels. Further, for the non-computed rows of the first-set of pixels and the second-set of pixels, previous values of the pixels may be provided in the disparity map. Further, for the non-computed columns of the first-set of pixels and the second-set of pixels, a saturation value, arrived at by a heuristic model, may be provided as the aggregated cost value, which is in turn used to compute the disparity map. It may be noted by a person skilled in the art that the intervals of rows and columns may be selected in different permutations and combinations for applying the semi-global matching technique. Further, the above matching technique at different intervals of rows and columns leads to optimization of the run time of the system 102.
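For illustration only, the sparse-grid selection described above may be sketched as follows. The interval values (every 2nd row, every 3rd column), the saturation fill value, and the `match_fn` callback are exemplary assumptions, not values fixed by the disclosure:

```python
ROW_STEP, COL_STEP = 2, 3      # assumed intervals: every 2nd row, every 3rd column
SATURATION_COST = 255          # placeholder for the heuristically derived saturation value

def sparse_disparity(height, width, match_fn):
    """Compute disparities only on the sparse grid; fill the rest.

    Un-computed columns on a computed row receive the saturation value;
    un-computed rows are copied from the previously computed row.
    """
    disparity = [[None] * width for _ in range(height)]
    for y in range(0, height, ROW_STEP):
        for x in range(0, width, COL_STEP):
            disparity[y][x] = match_fn(y, x)       # pixel actually matched
        for x in range(width):
            if disparity[y][x] is None:
                disparity[y][x] = SATURATION_COST  # un-computed column
    for y in range(height):
        if y % ROW_STEP != 0:
            disparity[y] = list(disparity[y - 1])  # copy previous row's data
    return disparity
```

Only (height / 2) × (width / 3) matching calls are made in this sketch, which is the source of the run-time saving claimed above.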
[0029] Further, the semi-global matching algorithm enhances the quality of the disparity map and is robust under difficult imaging conditions. As shown in figures 3A and 3B, the steps performed by the semi-global matching module 216 may comprise: a) computation of the matching cost, b) aggregation of the matching cost, and c) computation/optimization of the disparity. A detailed explanation of the semi-global matching algorithm is provided in the subsequent paragraphs of the specification.
[0030] The matching cost computation approach is based on the absolute, squared, or sampling-insensitive difference of intensities or colors. Since such costs are sensitive to radiometric differences, costs based on image gradients may also be used. In the matching cost computation approach, a self-adapting dissimilarity measure may be incorporated that combines the sum of absolute intensity differences (SAD) and a gradient-based measure as follows:

CSAD(x, y, d) = Σ(i, j) ∈ N(x, y) |I1(i, j) - I2(i - d, j)|

CGRAD(x, y, d) = Σ(i, j) ∈ Nx(x, y) |∇x I1(i, j) - ∇x I2(i - d, j)| + Σ(i, j) ∈ Ny(x, y) |∇y I1(i, j) - ∇y I2(i - d, j)|

Where, N(x, y) is a 3 × 3 surrounding window at position (x, y),
Nx(x, y) is the surrounding window without the rightmost column,
Ny(x, y) is the surrounding window without the lowest row,
∇x is the forward gradient to the right, and
∇y is the forward gradient to the bottom.
[0031] An optimal weighting ω between CSAD and CGRAD may be determined by maximizing the number of reliable correspondences that are filtered out by applying a cross-checking test (i.e. comparing left-to-right and right-to-left disparity maps) in conjunction with a winner-take-all optimization (i.e. choosing the disparity with the lowest matching cost). The resulting dissimilarity measure is given by:

C(x, y, d) = (1 - ω) · CSAD(x, y, d) + ω · CGRAD(x, y, d)
[0032] Furthermore, the reliable correspondences may be utilized to predict the signal-to-noise ratio (SNR), which is used to normalize the dissimilarity measure. Because of the normalization, a fixed truncation threshold may be set right above the noise level to obtain a robust matching score.
[0033] Additionally, the dissimilarity between the pixels is computed as the minimum of the quantity and its symmetric counterpart. Thus, the definition of the dissimilarity is symmetric. The definition and computation of the dissimilarity are as shown in figure 4.
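The combined SAD-plus-gradient measure described above may, for illustration, be sketched in code. The grayscale row-list image representation, the border handling (interior pixels only), and the weight `w` are assumptions of the sketch; the window definitions follow the 3 × 3 neighborhood with its rightmost column and lowest row dropped for the two gradient terms:

```python
def c_sad(left, right, x, y, d):
    """Sum of absolute differences over a 3x3 window centred at (x, y)."""
    return sum(abs(left[y + j][x + i] - right[y + j][x + i - d])
               for j in (-1, 0, 1) for i in (-1, 0, 1))

def grad_x(img, x, y):
    return img[y][x + 1] - img[y][x]   # forward gradient to the right

def grad_y(img, x, y):
    return img[y + 1][x] - img[y][x]   # forward gradient to the bottom

def c_grad(left, right, x, y, d):
    """Gradient term over the reduced windows Nx (no rightmost column) and Ny (no lowest row)."""
    gx = sum(abs(grad_x(left, x + i, y + j) - grad_x(right, x + i - d, y + j))
             for j in (-1, 0, 1) for i in (-1, 0))
    gy = sum(abs(grad_y(left, x + i, y + j) - grad_y(right, x + i - d, y + j))
             for j in (-1, 0) for i in (-1, 0, 1))
    return gx + gy

def dissimilarity(left, right, x, y, d, w=0.5):
    """Weighted combination of the SAD and gradient measures; w is an assumed weight."""
    return (1 - w) * c_sad(left, right, x, y, d) + w * c_grad(left, right, x, y, d)
```

At the true disparity both terms vanish for a perfectly shifted image pair, which is what the winner-take-all step exploits.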

[0034] In the matching cost computation approach, C(d) = |Ib(x, y) - Im(x - d, y)| for every pixel of the first-set of pixels corresponding to the left image is stored in the cost buffer. The size of the cost buffer is [(width - maxD) * maxD]; every pixel from the first-set of pixels of the left image is compared with the (1 to maxD) elements in the second-set of pixels of the right image, and the difference value is stored in the cost buffer, i.e. each row of the cost buffer contains 32/64/96 pixel difference values for every pixel in the left image. The cost buffer is used later to find the objective function and for the cost aggregation.
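The cost-buffer layout just described can be sketched as follows for a single scanline; the row-list representation and starting the loop at x = maxD (so that all maxD candidates exist) are assumptions of the sketch:

```python
def fill_cost_buffer(left_row, right_row, max_d):
    """Store |Ib(x) - Im(x - d)| for d = 1..maxD, for each valid left pixel.

    The buffer holds (width - maxD) entries of maxD values each,
    matching the [(width - maxD) * maxD] size stated above.
    """
    buf = []
    for x in range(max_d, len(left_row)):
        buf.append([abs(left_row[x] - right_row[x - d])
                    for d in range(1, max_d + 1)])
    return buf
```

Each inner list corresponds to one left-image pixel's 32/64/96 difference values, depending on the chosen maxD.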
[0035] Furthermore, in the cost aggregation, the pixel-wise cost calculation may generally be ambiguous, and wrong matches may easily have a lower cost than correct ones due to noise. Therefore, to overcome this, an additional constraint may be added that supports smoothness by penalizing changes of neighboring disparities. The pixel-wise cost and the smoothness constraints are expressed by defining an energy E(D) that depends on the disparity image D. According to one embodiment of the present disclosure, the steps discussed in paragraphs [0033] to [0036] may be performed on the pixels present at an interval of every 3rd row of the first matrix associated with the left image. Further, the cost aggregation may also be performed on pixels present at a selected set of columns only. For the remaining columns (i.e., the un-computed columns), the pixel-wise cost may be filled with a saturation value of the cost, which is arrived at using a heuristic model.

E(D) = Σp ( C(p, Dp) + Σq ∈ Np P1 T[|Dp - Dq| = 1] + Σq ∈ Np P2 T[|Dp - Dq| > 1] )

= (sum of all pixel matching costs for the disparities of D) + (sum of the constant penalty P1 for all pixels q in the neighborhood Np of p, for which the disparity changes a little bit (i.e. 1 pixel)) + (sum of the larger constant penalty P2, for all larger disparity changes)
[0036] Using a lower penalty for small changes permits an adaptation to slanted or curved surfaces. The constant penalty for all larger changes preserves discontinuities. Discontinuities are often visible as intensity changes. The problem of the stereo matching may now be formulated as finding the disparity image D that minimizes the energy E(D). The aggregated (smoothed) cost S(p, d) for a pixel p and disparity d is calculated by summing the costs of all 1D minimum cost paths that end in pixel p at disparity d, as shown in figure 5. The paths through disparity space, as shown in figure 5, are projected as straight lines into the base image, but as non-straight lines into the corresponding match image, according to disparity changes along the paths. It is noteworthy that only the cost of the path is required and not the path itself.
[0037] The cost Lr(p, d) along a path traversed in the direction r of the pixel p at disparity d is defined recursively as:

Lr(p, d) = C(p, d) + min[Lr(p - r, d), Lr(p - r, d - 1) + P1, Lr(p - r, d + 1) + P1, mini Lr(p - r, i) + P2] - mink Lr(p - r, k)
[0038] However, the upper limit may now be given as L <= Cmax + P2. The costs Lr are summed over paths in all directions r. The number of paths is chosen as 5, as this has proven sufficient to provide good coverage of the 2D images.
[0039] The upper limit for S is easily determined as S <= 5(Cmax + P2) for 5 paths. An efficient implementation would pre-calculate the pixel-wise matching costs C(p, d), down-scaled to 11 bit integer values. Further, scaling to 11 bits guarantees that the aggregated costs in subsequent calculations do not exceed the 16 bit limit. All costs are stored in a 16 bit array C[] of size W × H × D. The calculation starts for each direction r at all pixels b of the image border with Lr(b, d) = C[b, d]. The path is traversed in the forward direction according to the cost equation. For each visited pixel p along the path, the costs Lr(p, d) are added to the values S[p, d] for all disparities d. The calculation of the path cost requires O(D) steps at each pixel, since the minimum cost of the previous pixel, e.g. mink Lr(p - r, k), is constant for all disparities of a pixel and can be pre-calculated. Each pixel is visited once per path, which results in a total complexity of O(WHD). The regular structure and simple operations, i.e., additions and comparisons, permit parallel calculations using integer based SIMD (Single Instruction Multiple Data) instructions.
[0040] Further, minimizing E(D) in a two-dimensional manner may be very costly. Therefore, the semi-global matching algorithm simplifies the optimization by traversing one-dimensional paths and enforces the constraints along these explicit directions. This approach requires a second phase known as cost aggregation. Equation 3 below describes the procedure for a horizontal path from the left to the right in an arbitrary image row y.
E(x, y, d) = C(x, y, d) + min[E(x - 1, y, d), E(x - 1, y, d - 1) + P1, E(x - 1, y, d + 1) + P1, mini E(x - 1, y, i) + P2]   (3)
[0041] The final (smoothed) costs S(x, y, d) for each pixel and each disparity are obtained by summing the costs Er(x, y, d) of paths in all directions r as: S(x, y, d) = Σr Er(x, y, d) (4)
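The left-to-right recurrence of equation (3) may be sketched as a small dynamic program over one scanline; the cost-volume layout (a list of per-pixel disparity cost lists) and the penalty values P1, P2 are illustrative assumptions:

```python
def aggregate_row(cost, p1=1, p2=4):
    """1D cost aggregation along a horizontal left-to-right path.

    `cost[x][d]` is the pixel-wise matching cost C(x, y, d) for one row y;
    returns E(x, y, d) per equation (3).
    """
    width, ndisp = len(cost), len(cost[0])
    e = [row[:] for row in cost]              # E(0, d) = C(0, d) at the border
    for x in range(1, width):
        prev = e[x - 1]
        best_prev = min(prev)                 # min_i E(x - 1, y, i), pre-calculated
        for d in range(ndisp):
            candidates = [prev[d]]            # same disparity, no penalty
            if d > 0:
                candidates.append(prev[d - 1] + p1)   # small change, penalty P1
            if d + 1 < ndisp:
                candidates.append(prev[d + 1] + p1)
            candidates.append(best_prev + p2)         # large change, penalty P2
            e[x][d] = cost[x][d] + min(candidates)
    return e
```

A full implementation would repeat this for each chosen path direction and sum the results into S(x, y, d) per equation (4).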
[0042] According to the disparity computation/optimization approach, the disparity image Db that corresponds to the base image Ib is determined, as in local stereo methods, by selecting for each pixel p the disparity d that corresponds to the minimum cost, i.e. mind S[p, d]. For sub-pixel estimation, a quadratic curve is fitted through the neighboring costs, i.e., at the next higher and lower disparity, and the position of the minimum is calculated. Using a quadratic curve is theoretically justified only for correlation using the sum of squared differences. The disparity image Dm that corresponds to the match image Im can be determined from the same costs by traversing the epipolar line that corresponds to the pixel q of the match image. Again, the disparity d may be selected which corresponds to the minimum cost, i.e. mind S[emb(q, d), d]. However, the cost aggregation step does not treat the base and match images symmetrically. Slightly better results may be obtained if Dm is calculated from scratch, i.e. by performing the pixel-wise matching and aggregation, but with Im as the base and Ib as the match image. It depends on the application whether or not an increased runtime is acceptable for slightly better object borders. Outliers are filtered from Db and Dm using a median filter with a small window, i.e. 3 × 3.
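The winner-take-all selection with the quadratic sub-pixel refinement described above may be sketched for a single pixel as follows; `costs` stands in for S[p, d] over all candidate disparities:

```python
def wta_subpixel(costs):
    """Pick the disparity with minimum cost, then refine it sub-pixel.

    A parabola is fitted through the cost at the winning disparity and its
    two neighbors; the parabola's vertex gives the refined disparity.
    """
    d = min(range(len(costs)), key=costs.__getitem__)  # mind S[p, d]
    if 0 < d < len(costs) - 1:
        c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
        denom = c0 - 2 * c1 + c2
        if denom != 0:
            d += 0.5 * (c0 - c2) / denom  # vertex of the fitted parabola
    return d
```

When the neighboring costs are symmetric the correction is zero and the integer disparity is returned unchanged.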
[0043] The calculation of Db as well as Dm permits the determination of occlusions and false matches by performing a consistency check. Each disparity of Db is compared to its corresponding disparity of Dm. The disparity is set to invalid (Dinv) if both differ.
Dp = Dbp, if Dbp = Dmq; Dp = Dinv, otherwise; where q denotes the pixel of the match image that corresponds to the pixel p of the base image.
[0044] The rows of the disparity map for which the disparity values (Dp) have not been computed may be filled with previously computed disparity values corresponding to the selected rows (as discussed above). The consistency check enforces the uniqueness constraint by permitting one-to-one mappings only. The disparity computation and consistency check require visiting each pixel at each disparity a constant number of times.
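For illustration, the left-right consistency check may be sketched as follows; the exact-agreement rule follows the statement above that the disparity is invalidated if the two maps differ, while the -1 invalid marker is an assumption of the sketch:

```python
D_INV = -1  # assumed marker for an invalid disparity

def consistency_check(d_base, d_match):
    """Keep a base-map disparity only if the match map agrees at the
    corresponding pixel x - d; otherwise mark it invalid (occlusion or
    false match)."""
    h, w = len(d_base), len(d_base[0])
    out = [row[:] for row in d_base]
    for y in range(h):
        for x in range(w):
            d = d_base[y][x]
            xm = x - d                      # corresponding pixel in the match image
            if not (0 <= xm < w) or d_match[y][xm] != d:
                out[y][x] = D_INV
    return out
```

Because only matching pairs that agree in both directions survive, the check enforces the one-to-one mapping mentioned above.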
[0045] Further, the distance computing module 220 of the system 102 computes, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, the focal length of the stereo camera 104, and the baseline of the stereo camera 104. The distance may be computed as d = B*f/(x_l - x_r), where B is the baseline, i.e. the distance between the two stereo cameras, f is the focal length of the stereo camera, and (x_l - x_r) is the disparity.
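The triangulation formula above reduces to a one-line computation. The numeric values in the comment are illustrative assumptions, not parameters from the disclosure.

```python
def stereo_distance(disparity_px, focal_px, baseline_m):
    """Triangulation: distance = B * f / (xl - xr).

    disparity_px : disparity (xl - xr) in pixels
    focal_px     : focal length f in pixels
    baseline_m   : baseline B (camera separation) in metres
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return baseline_m * focal_px / disparity_px

# e.g. with an assumed 0.12 m baseline, 700 px focal length, 14 px disparity:
# stereo_distance(14, 700, 0.12) -> 6.0 m
```

Note that distance is inversely proportional to disparity, so nearby objects (large disparity) are measured more precisely than distant ones.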
[0046] Referring now to Figure 6, method for determining a distance between a vehicle and an object appearing in a path of the vehicle is shown, in accordance with an embodiment of the present disclosure. The method 600 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 600 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0047] The order in which the method 600 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 600 or alternate methods. Additionally, individual blocks may be deleted from the method 600 without departing from the spirit and scope of the disclosure described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 600 may be considered to be implemented in the above described system 102.
[0048] At block 602, a left image and a right image may be received. The left image and the right image may comprise an object appearing in a path of a vehicle. Further, the left image and the right image may be captured by a stereo camera coupled with the vehicle. Further, the left image and the right image may comprise a first-set of pixels and a second-set of pixels respectively. The first-set of pixels and the second-set of pixels may be separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns.
[0049] At block 604, a semi-global matching technique may be applied as described in blocks 604A and 604B.
[0050] At block 604A, a subset of the first-set of pixels is matched with a subset of the second-set of pixels. Further, the subset of the first set of pixels may correspond to predefined rows and predefined columns in the first matrix.
[0051] At block 604B, a disparity map may be generated based on the matching. Further, the disparity map may comprise disparity values based on the matching of the subset of the first-set of pixels and the subset of second-set pixels.
[0052] At block 606, distance between the object and the vehicle may be computed, using a triangulation technique, based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.
[0053] Although implementations for methods and devices for determining a distance between a vehicle and an object appearing in a path of the vehicle have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for determining a distance between a vehicle and an object appearing in a path of the vehicle.
CLAIMS:

WE CLAIM:

1. A method for determining a distance between a vehicle and an object appearing in a path of the vehicle, the method comprising:
receiving, by a processor, a left image and a right image comprising an object appearing in a path of a vehicle, wherein the left image and the right image are captured by a stereo camera coupled with the vehicle, and wherein the left image and the right image comprise a first-set of pixels and a second-set of pixels respectively, and wherein the first-set of pixels and the second-set of pixels are separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns;
applying, by the processor, a semi-global matching technique on the left image and the right image by:
matching a subset of the first-set of pixels with a subset of the second-set of pixels, wherein the subset of the first set of pixels correspond to predefined rows and predefined columns in the first matrix, and
generating a disparity map based on the matching, wherein the disparity map comprises disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels; and
computing, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.

2. The method of claim 1, further comprising determining edges of the left image and the right image using a Sobel filter.

3. The method of claim 1, further comprising interpolating the left image and the right image, after being normalized, to a predefined level based on parameters of the stereo camera and a target distance of the object, wherein the interpolation enables a sub-pixel level processing of the left image and the right image.
4. A system (102) for determining a distance between a vehicle (108) and an object appearing in a path of the vehicle (108), wherein the system (102) comprises:
a processor (202); and
a memory (206) coupled to the processor (202), wherein the memory (206) has a plurality of modules (208) stored therein that are executable by the processor (202), the plurality of modules (208) comprising:
a receiving module (210) to receive a left image and a right image comprising an object appearing in a path of a vehicle, wherein the left image and the right image are captured by a stereo camera coupled with the vehicle, and wherein the left image and the right image comprise a first-set of pixels and a second-set of pixels respectively, and wherein the first-set of pixels and the second-set of pixels are separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns;
a semi-global matching module (216) to apply a semi-global matching technique on the left image and the right image by:
matching a subset of the first-set of pixels with a subset of the second-set of pixels, wherein the subset of the first set of pixels correspond to predefined rows and predefined columns in the first matrix, and
generating a disparity map based on the matching, wherein the disparity map comprises disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels; and
a distance computing module (218) to compute, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.

5. The system (102) of claim 4, further comprising a Sobel filter (212) to determine edges of the left image and the right image.

6. The system (102) of claim 4, further comprising an interpolation module (214) to interpolate the left image and the right image to a predefined level based on parameters of the stereo camera and a target distance of the object, wherein the interpolation enables a sub-pixel level processing of the left image and the right image.

7. A non-transitory computer readable medium embodying a program executable in a computing device for determining a distance between a vehicle and an object appearing in a path of the vehicle, the program comprising:
a program code for receiving a left image and a right image comprising an object appearing in a path of a vehicle, wherein the left image and the right image are captured by a stereo camera coupled with the vehicle, and wherein the left image and the right image comprise a first-set of pixels and a second-set of pixels respectively, and wherein the first-set of pixels and the second-set of pixels are separately arranged in a first matrix and a second matrix respectively, wherein the first matrix and the second matrix comprise rows and columns;
a program code for applying a semi-global matching technique on the left image and the right image by:
matching a subset of the first-set of pixels with a subset of the second-set of pixels, wherein the subset of the first set of pixels correspond to predefined rows and predefined columns in the first matrix, and
generating a disparity map based on the matching, wherein the disparity map comprises disparity values based on the matching of the subset of the first-set of pixels and the subset of the second-set of pixels; and
a program code for computing, by using a triangulation technique, the distance between the object and the vehicle based on the disparity values, a focal length of the stereo camera, and a baseline of the stereo camera.

Documents

Application Documents

# Name Date
1 1347-MUM-2014-Request For Certified Copy-Online(18-04-2015).pdf 2015-04-18
2 Form 3 [14-12-2016(online)].pdf 2016-12-14
3 OnlinePostDating.pdf 2018-08-11
4 Form 2.pdf 2018-08-11
5 Form 2 (Prov).pdf 2018-08-11
6 Figure for abstract.jpg 2018-08-11
7 Certified Copy_1347-MUM-2014.pdf 2018-08-11
8 1347-MUM-2014-FORM 26(30-5-2014).pdf 2018-08-11
9 1347-MUM-2014-FORM 1(7-10-2014).pdf 2018-08-11
10 1347-MUM-2014-CORRESPONDENCE(7-10-2014).pdf 2018-08-11
11 1347-MUM-2014-CORRESPONDENCE(30-5-2014).pdf 2018-08-11
12 1347-MUM-2014-FER.pdf 2019-10-21
13 1347-MUM-2014-OTHERS [21-04-2020(online)].pdf 2020-04-21
14 1347-MUM-2014-FER_SER_REPLY [21-04-2020(online)].pdf 2020-04-21
15 1347-MUM-2014-COMPLETE SPECIFICATION [21-04-2020(online)].pdf 2020-04-21
16 1347-MUM-2014-CLAIMS [21-04-2020(online)].pdf 2020-04-21
17 1347-MUM-2014-PatentCertificate14-02-2022.pdf 2022-02-14
18 1347-MUM-2014-IntimationOfGrant14-02-2022.pdf 2022-02-14
19 1347-MUM-2014-RELEVANT DOCUMENTS [30-09-2023(online)].pdf 2023-09-30

Search Strategy

1 searchs_1347_mum_2014_18-10-2019.pdf

ERegister / Renewals

3rd: 28 Feb 2022 (16/04/2016 - 16/04/2017)
4th: 28 Feb 2022 (16/04/2017 - 16/04/2018)
5th: 28 Feb 2022 (16/04/2018 - 16/04/2019)
6th: 28 Feb 2022 (16/04/2019 - 16/04/2020)
7th: 28 Feb 2022 (16/04/2020 - 16/04/2021)
8th: 28 Feb 2022 (16/04/2021 - 16/04/2022)
9th: 28 Feb 2022 (16/04/2022 - 16/04/2023)
10th: 12 Apr 2023 (16/04/2023 - 16/04/2024)
11th: 16 Apr 2024 (16/04/2024 - 16/04/2025)
12th: 25 Mar 2025 (16/04/2025 - 16/04/2026)