Abstract: The purpose of the present invention is to provide an environment recognition device that can recognize the detailed shape of a complicated road such as an intersection before reaching it, by using an environment measurement sensor and simple map information. An environment recognition device, for recognizing the environment of a traveling road on the basis of measurement data output from an environment measurement sensor, comprises: a basic structure analysis unit which analyzes a basic structure of the traveling road on the basis of map information obtained from a general-purpose map or a server map; a detailed structure analysis unit which restores a detailed structure of the traveling road on the basis of the basic structure; a traveling road correspondence control unit which, according to the basic structure of the traveling road, changes recognition logic for recognizing environmental information from the measurement data; and an external device control unit which controls the environment measurement sensor or an external device on the basis of the environmental information recognized by the traveling road correspondence control unit.
[0001] The present invention relates to an environment recognition device that recognizes the surrounding environment using an environment measurement sensor and map information and outputs information necessary for control and warning in advance.
Background Art
[0002]
In recent years, preventive safety technologies that recognize the surrounding environment of the own vehicle have been spreading in automobiles, and the range of scenes these technologies handle is also expanding. In order to realize safer and more comfortable autonomous driving, more accurate environment recognition at intersections in particular is an important problem. However, since conventional environment recognition technology using only an environment measurement sensor is basically intended for environment recognition on a single road, it is difficult to appropriately recognize a complicated road shape such as an intersection.
[0003]
In response to this problem, PTL 1 proposes an own vehicle position estimation device (ECU) including: an indication recognition unit that recognizes a road marking indicating a boundary of a lane ahead on the traveling route of the own vehicle on the basis of an imaging result by an imaging unit; an extraction unit that extracts, from the recognized road marking, a portion that changes at a predetermined angle with respect to the direction along the lane as a candidate for a shape change portion indicating a shape change of the road; an information acquisition unit that acquires position information indicating a position where the shape change of the road occurs ahead on the traveling route; a narrowing unit that narrows down the extracted candidates to those present within a predetermined distance, in the direction along the lane, from the position indicated by the position information; an indication specifying unit that specifies the narrowed candidate as the shape change portion ahead on the traveling route; and a position estimation unit that specifies the own vehicle position on a map on the basis of a branch point or a junction point of the identified lane, in order to "reduce the possibility of erroneously recognizing the start position of branching or merging" as described in the abstract of PTL 1.
[0004]
Then, by matching the own vehicle position estimated by the ECU (own vehicle position estimation device) of PTL 1 with a high-precision map created for automatic driving, a complicated road shape in the traveling direction is grasped in advance and used for preventive safety and automatic driving (for example, paragraphs 0024 and 0047 of the same document).
Citation List
Patent Literature
[0005]
PTL 1: JP 2018-40692A
Summary of Invention
Technical Problem
[0006]
As described above, PTL 1 requires a high-precision map for automatic driving in which a wide variety of information is registered. In limited areas such as urban areas and automatic driving special zones, maintenance of such high-precision maps is expected to continue in the future. In sparsely populated areas and the like, however, maintenance is expected to lag because of cost problems such as surveying of road shapes. Therefore, the area where preventive safety and automatic driving can be realized by the technology of PTL 1 is expected to remain limited. In addition, since the usage cost of a high-precision map service is expected to be relatively high, it is desirable to realize preventive safety and automatic driving similar to those of PTL 1 on the basis of a simple map that can be used at lower cost.
[0007]
In addition, environment measurement sensors currently on the market basically assume measurement of the environment on a single road, and it is difficult for them to fully realize the environment recognition needed for automatic driving to turn right or left at an intersection. This is because, when the traveling road is a single road, it is sufficient to perform recognition processing on the assumption that there are two boundary lines, one on each of the left and right sides, whereas at an intersection or a branching/merging section there are roads orthogonal to the traveling direction, roads extending obliquely, and the like, and it is difficult to recognize such a complicated road shape with high accuracy and without erroneous detection.
[0008]
Therefore, an object of the present invention is to provide an environment recognition device capable of recognizing the detailed shape of a complicated road such as an intersection before the vehicle reaches it, by using an environment measurement sensor and simple map information.
Solution to Problem
[0009]
In order to solve the above problems, an environment recognition device of the present invention recognizes an environment of a traveling road on the basis of measurement data output from an environment measurement sensor. The environment recognition device includes: a basic structure analysis unit that analyzes a basic structure of the traveling road on the basis of map information obtained from a general-purpose map or a server map; a detailed structure analysis unit that restores a detailed structure of the traveling road on the basis of the basic structure of the traveling road; a traveling road correspondence control unit that changes a recognition logic for recognizing environmental information from the measurement data according to the basic structure of the traveling road; and an external device control unit that controls the environment measurement sensor or an external device on the basis of the environmental information recognized by the traveling road correspondence control unit.
Advantageous Effects of Invention
[0010]
According to the environment recognition device of the present invention, it is possible to recognize a detailed shape of a complicated road such as an intersection before reaching the intersection by using an environment measurement sensor and simple map information.
Brief Description of Drawings
[0011]
[FIG. 1] FIG. 1 is a functional block diagram of an in-vehicle environment recognition device according to an embodiment.
[FIG. 2] FIG. 2 is a functional block diagram of an environment measurement sensor.
[FIG. 3] FIG. 3 is a functional block diagram of a map information unit.
[FIG. 4] FIG. 4 is an example of an actual road network and a general-purpose map.
[FIG. 5] FIG. 5 is a functional block diagram of a traveling road analysis unit.
[FIG. 6] FIG. 6 is an example of basic structure analysis processing.
[FIG. 7] FIG. 7 is an example of detailed structure restoration processing.
[FIG. 8] FIG. 8 is a functional block diagram of a detailed road shape addition unit.
[FIG. 9] FIG. 9 is an example of additional information priority.
[FIG. 10] FIG. 10 is a functional block diagram of a traveling road correspondence control unit.
[FIG. 11] FIG. 11 is a functional block diagram of a recognition processing selection unit.
[FIG. 12] FIG. 12 is a functional block diagram of a recognition processing setting unit.
[FIG. 13A] FIG. 13A is an example of a set road edge processing region.
[FIG. 13B] FIG. 13B is an example of a set lane processing region.
[FIG. 14] FIG. 14 is an example of model fitting processing of a traveling road.
[FIG. 15] FIG. 15 is an example of label addition.
[FIG. 16] FIG. 16 is a processing flowchart.
Description of Embodiments
[0012]
Hereinafter, an in-vehicle environment recognition device 100 according to an embodiment of the present invention will be described with reference to the drawings.
The in-vehicle environment recognition device 100 of the present embodiment assists manual driving by a driver and automatic driving by an automatic driving system by providing a detailed structure of a traveling road in advance. [0013]
As illustrated in the functional block diagram of FIG. 1, the in-vehicle environment recognition device 100 of the present embodiment is connected to an external environment measurement sensor 1 and an external device 6, and internally includes a map information unit 2, a traveling road analysis unit 3, a traveling road correspondence control unit 4, and an external device control unit 5. Note that the configuration illustrated here is merely an example: the environment measurement sensor 1 and the external device 6 may be built into the in-vehicle environment recognition device 100, and the map information unit 2 may be separated from it. Hereinafter, each component is first outlined and then described in detail.
[0014]
The in-vehicle environment recognition device 100 is actually realized by a computer (in the automotive field, also called an electronic control unit (ECU)) including hardware such as an arithmetic device (e.g., a CPU), a main storage device, an auxiliary storage device, and a communication device. The arithmetic device executes a program loaded into the main storage device while referring to a database recorded in the auxiliary storage device, thereby realizing the functions of the map information unit 2, the traveling road analysis unit 3, and so on. Hereinafter, description of techniques well known in the computer field is omitted as appropriate.
[0015]
The environment measurement sensor 1 is a sensor that measures the external environment of a vehicle and outputs measurement data. [0016]
The map information unit 2 mainly holds a general-purpose map Mg and a detailed map Md described later, and acquires the own vehicle position using GNSS information. Here, the general-purpose map Mg is a simple map, for example, a map acquired from a car navigation system mounted on the vehicle. On the other hand, the detailed map Md is a map that also includes detailed environmental information not contained in the general-purpose map Mg, and is created by adding environmental information and the like recognized on the basis of the measurement data of the environment measurement sensor 1 to the general-purpose map Mg. Note that map information acquired from a server via a network (hereinafter referred to as a server map Ms) may be held together with, or instead of, the general-purpose map Mg. The server map Ms is a map having roughly the same amount of information as the general-purpose map Mg, with its data updated as needed by a map provider or the like.
[0017]
The traveling road analysis unit 3 mainly analyzes the position and traveling direction of an own vehicle V on the general-purpose map Mg, and acquires in advance a basic structure (shape of an intersection, etc.) of a traveling road change point that the own vehicle V will encounter next. The traveling road analysis unit 3 further predicts a detailed structure such as a lane width, a shoulder position, and the number of lanes of the traveling road from the acquired basic structure and the like. [0018]
The traveling road correspondence control unit 4 mainly recognizes environmental information such as the road shape using the measurement data of the environment measurement sensor 1, based on the detailed structure of the traveling road predicted by the traveling road analysis unit 3. More specifically, by dynamically changing the recognition method and processing regions for road shapes such as the traveling road and road edges according to the road shape, error amount, and the like predicted by the traveling road analysis unit 3, the environment such as the road shape is recognized more stably and with higher accuracy.
[0019]
The external device control unit 5 mainly controls the environment measurement sensor 1 and the external device 6 on the basis of the determination of the traveling road correspondence control unit 4. When the external device 6 is a display or a speaker, the in-vehicle environment recognition device 100 supports the driver by displaying recognition results of the surrounding environment, displaying driving support information, issuing alarms for safety support, and the like. On the other hand, when the external device 6 is an automatic driving system, the in-vehicle environment recognition device 100 supports the realization of more appropriate automatic driving by providing in advance the environmental information necessary for vehicle control.
[0020]
Details of the environment measurement sensor 1, the map information unit 2, the traveling road analysis unit 3, the traveling road correspondence control unit 4, and the external device control unit 5 outlined above will be sequentially described below.
As illustrated in FIG. 2, the environment measurement sensor 1 includes a front sensor 11, a matching unit 12, and a 3D point group acquisition unit 13. [0021]
The front sensor 11 is, for example, a stereo camera that is installed in the vehicle and captures an image of the front. The stereo camera is a camera in which a left camera 11L and a right camera 11R are provided side by side in the left-right direction at a predetermined interval. Note that, although an example in which a stereo camera is used as the front sensor 11 will be described below, the front sensor 11 may be any sensor that can measure the front of the own vehicle V, and may be a monocular camera or a LIDAR. [0022]
In a case where the same object is captured in the left and right images captured by the stereo camera, the matching unit 12 specifies image positions of the same object on the left and right images by matching processing, and specifies a difference in positions (parallax) captured on the images when viewed from the left and right cameras, thereby measuring the distance to the object. [0023]
The 3D point group acquisition unit 13 restores the three-dimensional position of an object by triangulation, using a triangle whose base connects the left and right cameras and whose apex is the position of the same object on the images.
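To make the geometry concrete, the sketch below shows the standard disparity-to-depth relation underlying such triangulation. It is a minimal illustration, not the device's actual implementation; the function and parameter names (focal length f in pixels, baseline in meters, principal point (cu, cv)) are assumptions introduced here.

```python
import numpy as np

def disparity_to_point(u, v, disparity, f, baseline, cu, cv):
    """Recover a 3D point (left-camera frame) from one stereo match.

    Illustrative sketch: depth follows from similar triangles formed by
    the two camera centers (the base) and the matched image position.
    """
    if disparity <= 0:
        return None  # no valid match; depth is undefined
    z = f * baseline / disparity  # depth [m] from the parallax
    x = (u - cu) * z / f          # lateral offset [m]
    y = (v - cv) * z / f          # vertical offset [m]
    return np.array([x, y, z])

# Example: a match at pixel (700, 400) with 8 px disparity, f = 1000 px,
# 0.35 m baseline -> a point roughly 43.75 m ahead.
point = disparity_to_point(700, 400, 8.0, f=1000.0, baseline=0.35, cu=640, cv=360)
```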
As illustrated in FIG. 3, the map information unit 2 includes a general-purpose map storage unit 21, a GNSS unit 22, a detailed map storage unit 23, and a detailed map update unit 24, and specifies the position of the own vehicle V on the map. [0024]
The general-purpose map storage unit 21 stores the general-purpose map Mg, a simple map used in a car navigation system. The general-purpose map Mg expresses a road network by nodes that define positions on the map and links that connect the nodes. For example, when the road network around the own vehicle V is the environment illustrated in FIG. 4(a), the corresponding general-purpose map Mg is expressed by a combination of a plurality of nodes (white circles) and links (solid lines) as illustrated in FIG. 4(b).
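As a concrete illustration, such a node-and-link road network can be held in a minimal graph structure. This is only a sketch under assumed field names (the actual map format is not specified in the text); Python 3.10+ type syntax is used.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: int
    lat: float
    lon: float

@dataclass
class Link:
    start: int                         # node_id of one endpoint
    end: int                           # node_id of the other endpoint
    speed_limit: float | None = None   # optional attributes a simple map may carry
    road_type: str | None = None

@dataclass
class GeneralPurposeMap:
    nodes: dict[int, Node] = field(default_factory=dict)
    links: list[Link] = field(default_factory=list)

    def links_at(self, node_id: int) -> list[Link]:
        """Links incident to a node; the incident-link count hints at the
        basic structure (2: single road, 3: T-junction, 4: cross)."""
        return [l for l in self.links if node_id in (l.start, l.end)]
```
[0025]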
The GNSS unit 22 specifies the position of the own vehicle V on the general-purpose map Mg, expressed by the combination of nodes and links, by using global navigation satellite system (GNSS) information. Note that the position information specified using the GNSS information may be corrected using other sensors (for example, a camera, a radar, a gyroscope, and the behavior of the own vehicle V).
[0026]
The detailed map storage unit 23 adds environmental information measured using the environment measurement sensor 1 (for example, a lane width, a road angle, the number of lanes, a distance from the outermost lane to the shoulder, a traveling road shape, and the like) to the general-purpose map Mg and stores the result as a detailed map Md. By using the detailed map Md stored here, when the own vehicle V travels on the same traveling road next time, the environment can be recognized more accurately and stably using the detailed information stored during the previous travel.
[0027]
The detailed map update unit 24 updates the detailed map Md stored in the detailed map storage unit 23 when the map data of the general-purpose map Mg becomes outdated due to a new or abandoned road, when a road is temporarily closed due to construction, or the like. As a result, information such as a road that does not exist in the general-purpose map Mg but is passable, or a road that exists in the general-purpose map Mg but is impassable, is added to the detailed map Md. However, even when environmental information such as a new road is detected, the detailed map Md need not be updated immediately after a single detection; it may be updated when the same environmental information is detected repeatedly.
As illustrated in FIG. 5, the traveling road analysis unit 3 includes a basic structure analysis unit 31 and a detailed structure analysis unit 32. The basic structure analysis unit 31 analyzes the basic structure of the road on the traveling road of the own vehicle V based on the general-purpose map Mg acquired from the general-purpose map storage unit 21. The detailed structure analysis unit 32 then adds information to the basic structure analyzed by the basic structure analysis unit 31 and predicts the detailed road shape from it.
Each will be described in detail below.
As illustrated in FIG. 5, the basic structure analysis unit 31 includes a traveling road data reading unit 311, a node and link analysis unit 312, a basic road shape analysis unit 313, and an own vehicle position analysis unit 314. [0028]
The traveling road data reading unit 311 reads information on nodes and links in the direction of the own vehicle's traveling road from the general-purpose map Mg. For example, when the own vehicle V is traveling on the road network illustrated in FIG. 4(a), the traveling road data reading unit 311 reads node position information and link connection information as illustrated in FIG. 4(b). Note that, since the general-purpose map Mg is generally divided into map regions of, for example, 5 km square, the traveling road data reading unit 311 does not need to acquire the entire general-purpose map Mg; by acquiring only the node and link information of the 5 km square where the own vehicle V exists, the subsequent processing load is reduced.
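Since only the map square containing the own vehicle needs to be read, the lookup reduces to a tile index, as in the sketch below. The tile size in degrees and the dictionary-of-tiles layout are illustrative assumptions, not the actual map format.

```python
def tile_key(lat: float, lon: float, tile_deg: float = 0.05) -> tuple[int, int]:
    """Index of the map tile containing a position. 0.05 degrees of
    latitude is roughly 5.5 km, an assumed stand-in for the 5 km square."""
    return (int(lat // tile_deg), int(lon // tile_deg))

def load_local_nodes_and_links(tiles: dict[tuple[int, int], object],
                               own_lat: float, own_lon: float):
    """Read only the node/link data of the tile where the own vehicle V is,
    instead of the entire general-purpose map Mg, reducing processing load."""
    return tiles.get(tile_key(own_lat, own_lon))
```
[0029]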
The node and link analysis unit 312 represents a road network around the own vehicle V on a map as illustrated in FIG. 4(b) on the basis of the node position information and the link connection information acquired by the traveling road data reading unit 311. [0030]
The own vehicle position analysis unit 314 acquires the position of the own vehicle V, and includes a GNSS error analysis unit 314a, a sensor error analysis unit 314b, and a time-series integrated correction unit 314c. The GNSS error analysis unit 314a performs error analysis using the own vehicle behavior and the GNSS information, comparing where the own vehicle V is traveling on the map against the map information, thereby correcting the own vehicle position with high accuracy. The sensor error analysis unit 314b analyzes errors in the vertical and horizontal positions on the map by using information from an inertial sensor (gyro), a camera, a millimeter-wave radar, LIDAR, and the like. The time-series integrated correction unit 314c performs position correction on the map using the error information acquired by the GNSS error analysis unit 314a and the sensor error analysis unit 314b. However, if position correction were applied from an instantaneous determination of the GNSS error and the sensor error, position updates based on uncertain information, or unstable position updates, would be likely to occur. Therefore, the time-series integrated correction unit 314c performs stable error correction by analyzing the errors in time series.
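As one way to picture the time-series integration, the sketch below blends instantaneous position fixes into a smoothed estimate and rejects isolated jumps. The gain and gating distance are illustrative assumptions, not values given in the text, and the actual correction method is not limited to this filter.

```python
import numpy as np

class TimeSeriesIntegratedCorrector:
    """Blend instantaneous GNSS/sensor position fixes over time instead of
    adopting each one directly, so a single uncertain fix cannot cause an
    unstable position update. Gain and gate values are assumptions."""

    def __init__(self, gain: float = 0.2, gate_m: float = 15.0):
        self.gain = gain        # weight given to the newest fix
        self.gate_m = gate_m    # fixes farther than this are treated as outliers
        self.estimate = None    # smoothed (x, y) position in meters

    def update(self, fix) -> np.ndarray:
        fix = np.asarray(fix, dtype=float)
        if self.estimate is None:
            self.estimate = fix.copy()
        elif float(np.linalg.norm(fix - self.estimate)) < self.gate_m:
            # exponential smoothing: small gain = stable, large gain = responsive
            self.estimate += self.gain * (fix - self.estimate)
        # an out-of-gate fix is ignored as an instantaneous outlier
        return self.estimate
```
[0031]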
The basic road shape analysis unit 313 predicts the road structure of the change point (such as an intersection) that the own vehicle V is expected to encounter next, based on the general-purpose map Mg (FIG. 4(b)) represented by the node and link analysis unit 312 and the own vehicle position acquired by the own vehicle position analysis unit 314. For example, when the own vehicle V is traveling at the position illustrated in FIG. 6(a) on the general-purpose map Mg, the basic road shape analysis unit 313 sets the region surrounded by the broken line in front of the own vehicle V as the analysis target. The basic road shape analysis unit 313 then selects the "cross" corresponding to the broken-line region of FIG. 6(a) from the plurality of basic structure candidates illustrated in FIG. 6(b), predicts the "cross" as the basic structure of the change point that the own vehicle V will encounter next (FIG. 6(c)), and further predicts the distance to the cross and the like.
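The selection of a basic-structure candidate such as the "cross" of FIG. 6(b) can be pictured as a simple classification over the links meeting at the next node. The sketch below uses only the link count and headings; the candidate names beyond "single road" and "cross", and the 20-degree orthogonality threshold, are illustrative assumptions.

```python
import math

def classify_basic_structure(own_heading: float, link_headings: list[float]) -> str:
    """Guess the basic structure of the next change point from the headings
    (radians) of the links meeting there, in the spirit of FIG. 6(b)."""
    n = len(link_headings)
    if n <= 2:
        return "single road"
    if n == 3:
        return "T-junction"
    if n == 4:
        # a "cross" if some road is roughly orthogonal to the own heading
        rel = [abs(math.remainder(h - own_heading, math.pi)) for h in link_headings]
        if any(abs(r - math.pi / 2) < math.radians(20) for r in rel):
            return "cross"
        return "oblique cross"
    return "multi-way junction"
```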
As illustrated in FIG. 5, the detailed structure analysis unit 32 includes a basic road shape reading unit 321, a detailed road shape addition unit 322, and a detailed shape update unit 323. [0032]
The basic road shape reading unit 321 acquires information (for example, a “cross”) of the basic structure predicted by the basic structure analysis unit 31. [0033]
The detailed road shape addition unit 322 adds the detailed shape and the like of the road to the basic structure acquired by the basic road shape reading unit 321 to restore the detailed structure. [0034]
FIG. 7 is a diagram specifically describing the addition of the detailed shape by the detailed road shape addition unit 322 and the restoration of the detailed structure. As illustrated in FIG. 7(a), when the basic structure predicted by the basic structure analysis unit 31 is a "cross", the detailed road shape addition unit 322 restores the detailed structure by adding detailed shapes to the basic structure, such as the number of lanes, the lane width, the distance from the outer lane to the shoulder, the angles between the roads at the intersection, and shape changes of the traveling road such as white line angles, using detailed shape values included in the general-purpose map Mg, default values described later, previous environment recognition results, and the like.
[0035]
FIG. 7(b) is an example of a detailed structure restored on the basis of the basic structure. As illustrated here, the detailed road shape addition unit 322 restores the detailed structure by filling in the number of lanes of each road, the lane width of each lane, the shoulder positions, a white line intersection angle θ1, a shoulder intersection angle θ2, and the like on the basic structure "cross" of FIG. 7(a).
[0036]
For example, when detailed shapes such as the number of lanes, the lane width, the traveling road shape, and angle information between white lines are registered in the general-purpose map Mg, the detailed structure analysis unit 32 restores the detailed structure with high accuracy by using those detailed shapes.
[0037]
On the other hand, when some or all of the above-described detailed shapes are not registered in the general-purpose map Mg, the detailed structure analysis unit 32 restores the detailed structure using default detailed shapes or the like. A default detailed shape may be a fixed value (for example, two lanes, a lane width of 3 m, a shoulder of 1.5 m, and the like), or a default value may be set on the basis of information such as the type and grade of the road registered in the general-purpose map Mg; if no information that can be used directly to restore the detailed structure is registered in the general-purpose map Mg, an appropriate value may be set using indirect information.
[0038]
As described above, since the detailed structure analysis unit 32 can restore the detailed structure with improved accuracy, the recognition processing in the traveling road correspondence control unit 4 described later, and the feedback information provided to the driver and the automatic driving system via the external device 6, can be made more appropriate.
[0039]
Here, a detailed shape setting method will be described with reference to Tables 1 to 8. [0040]
Table 1 shows the types of roads prescribed in Article 3(1) of the Road Structure Order, and Table 2 shows the grades of roads prescribed in Article 3(2) and the like of the Road Structure Order. [0041] [Table 1]
Roads / Areas                    | Suburban area | Urban area
National expressway and motorway | First type    | Second type
Other roads                      | Third type    | Fourth type
[0042] [Table 2]
Road type        | Area          | Grade by planned traffic volume (vehicles/day)
General road     | Flatland area | 20,000 or more: first grade; 4,000-20,000: second grade; less than 4,000: third grade
General road     | Mountain area | 20,000 or more: second grade; 1,500-20,000: third grade; less than 1,500: fourth grade
Prefectural road | Flatland area | 4,000 or more: second grade; less than 4,000: third grade
Prefectural road | Mountain area | 1,500 or more: third grade; less than 1,500: fourth grade
Municipal road   | Flatland area | 4,000 or more: second grade; 1,500-4,000: third grade; 500-1,500: fourth grade; less than 500: fifth grade
Municipal road   | Mountain area | 1,500 or more: third grade; 500-1,500: fourth grade; less than 500: fifth grade
[0043]
From these tables, it can be seen that, for example, a national expressway in an urban area is classified as a "second type" road, and a general road in a flatland area with a planned traffic volume of 20,000 vehicles/day or more is classified as a "first grade" road.
[0044]
When such type/grade information is registered in the general-purpose map Mg, the detailed road shape addition unit 322 can restore the detailed structure with high accuracy using the default values corresponding to the type information and the grade information. [0045]
For example, for a lane width or the like for which the Road Structure Order defines a legal value according to a combination of a type and a grade of a road, the legal value is used as it is as a default value. Specifically, by using the legal value (Table 3) of the lane width defined by Article 5(4) of the Road Structure Order or the legal value (Table 4) of the shoulder width defined by Article 8(2), (4), and (7) of the Road Structure Order, the detailed road shape addition unit 322 can accurately restore the detailed structure of the intersection or the like that complies with the legal value. [0046]
[Table 3]
Road classification                  | Lane width of general road [m]
First type, first grade              | 3.50
First type, second grade             | 3.50
First type, third grade              | 3.50
First type, fourth grade             | 3.25
Second type, first grade             | 3.50
Second type, second grade            | 3.25
Third type, first grade              | 3.50
Third type, second grade             | 3.25
Third type, third grade              | 3.00
Third type, fourth grade             | 2.75
Fourth type, first grade             | 3.25
Fourth type, second and third grades | 3.00
[0047]
[Table 4]
Type/grade classification           | Shoulder width on left of general road [m] | Shoulder width on right of general road [m]
First type, first and second grades | 2.5  | 1.25
First type, third and fourth grades | 1.75 | 0.75
Second type                         | 1.25 | 0.75
Third type, first grade             | 1.25 | 0.5
Third type, second to fourth grades | 0.75 | 0.5
Third type, fifth grade             | 0.5  | 0.5
Fourth type                         | 0.5  | 0.5
[0048]
However, in some cases the combination of the type and grade of the road is not registered in the general-purpose map Mg, and the legal value cannot be obtained from Table 3 or Table 4. In such a case, the detailed road shape addition unit 322 switches the default value on the basis of other information registered in the general-purpose map Mg, thereby improving the restoration accuracy of the detailed structure.
[0049]
For example, when road type information is registered in the general-purpose map Mg but road grade information is not, default values such as the number of lanes on one side are set using Table 5. Conversely, when the road grade information is registered in the general-purpose map Mg but the road type information is not, default values such as the number of lanes on one side are set using Table 6. Even when only Table 5 or Table 6 can be used in this way, the information is useful for roughly predicting the road width and the number of lanes, and the restoration accuracy of the detailed structure can be improved compared with having no information at all.
[0050]
[Table 5]
Classification       | Number of lanes on one side | Lane width [m] | Roadside width [m]
Expressway, type 1   | 4 | 3.75 | 2.5
Expressway, type 2   | 3 | 3.5  | 1.75
General road, type 3 | 2 | 3.25 | 1.25
General road, type 4 | 1 | 3.0  | 0.75
[0051]
[Table 6]
Grade   | Number of lanes on one side | Lane width [m] | Design speed [km/h] | Roadside width [m]
Grade 1 | 4 | 3.75 | 80 | 2.5
Grade 2 | 3 | 3.5  | 80 | 1.75
Grade 3 | 2 | 3.25 | 60 | 1.25
Grade 4 | 1 | 3.0  | 50 | 0.75
Grade 5 | 1 | 2.75 | 40 | 0.5
[0052]
Next, two default value setting methods will be exemplified for the case where direct information for restoring the detailed structure, such as the lane width, the number of lanes, and the road shoulder, is not registered in the general-purpose map Mg, and the road type and grade information is not registered either (that is, where none of Tables 3 to 6 is available).
[0053]
For example, when the road type and grade information is not registered in the general-purpose map Mg but rough information on the kind of road is registered, default values such as the number of lanes on one side according to the kind of road are set using Table 7. Note that, when some of the default values in Table 7 are registered in the general-purpose map Mg, the values in the general-purpose map Mg may be prioritized and only the missing information taken from Table 7. In addition, when rough information on the kind of road is not registered in the general-purpose map Mg but speed limit information can be acquired from the general-purpose map Mg or the like, default values such as the number of lanes on one side are switched dynamically according to the speed limit using Table 8. Even when the default values are set dynamically using Table 7 or Table 8 in this way, the restoration accuracy of the detailed structure can be improved compared with having no information at all.
[0054] [Table 7]
Road type           | Number of lanes on one side | Lane width [m] | Design speed [km/h] | Roadside width [m]
Expressway          | 4 | 3.75 | 80 | 2.5
Capital freeway     | 3 | 3.5  | 80 | 1.75
National expressway | 2 | 3.25 | 60 | 1.25
Prefectural road    | 1 | 3.0  | 50 | 0.75
Others              | 1 | 2.75 | 40 | 0.5
[0055] [Table 8]
Speed limit [km/h] | Number of lanes on one side | Lane width [m] | Design speed [km/h] | Roadside width [m]
100 | 4 | 3.75 | 80 | 2.5
80  | 3 | 3.5  | 80 | 1.75
60  | 2 | 3.25 | 60 | 1.25
50  | 1 | 3.0  | 50 | 0.75
40  | 1 | 2.75 | 40 | 0.5
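To make the fallback concrete, the sketch below encodes Table 8 and the fully fixed defaults mentioned earlier, and picks a row by speed limit. The dictionary layout and field names are illustrative assumptions; only the numbers come from the tables and the text.

```python
# Default detailed shapes keyed by speed limit [km/h], following Table 8.
DEFAULTS_BY_SPEED_LIMIT = {
    100: dict(lanes_one_side=4, lane_width_m=3.75, roadside_width_m=2.5),
    80:  dict(lanes_one_side=3, lane_width_m=3.5,  roadside_width_m=1.75),
    60:  dict(lanes_one_side=2, lane_width_m=3.25, roadside_width_m=1.25),
    50:  dict(lanes_one_side=1, lane_width_m=3.0,  roadside_width_m=0.75),
    40:  dict(lanes_one_side=1, lane_width_m=2.75, roadside_width_m=0.5),
}
# Fully fixed default from the text: two lanes, 3 m lane, 1.5 m shoulder.
FIXED_DEFAULT = dict(lanes_one_side=2, lane_width_m=3.0, roadside_width_m=1.5)

def default_detailed_shape(speed_limit: int | None) -> dict:
    """Pick the closest Table 8 row at or below the speed limit; when no
    speed limit is available, fall back to the fully fixed default."""
    if speed_limit is None:
        return FIXED_DEFAULT
    usable = [s for s in DEFAULTS_BY_SPEED_LIMIT if s <= speed_limit]
    key = max(usable) if usable else min(DEFAULTS_BY_SPEED_LIMIT)
    return DEFAULTS_BY_SPEED_LIMIT[key]
```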
[0056]
As illustrated in FIG. 5, the detailed shape update unit 323 includes a road edge update unit 323a, a lane update unit 323b, and an undulation update unit 323c.
[0057]
The road edge update unit 323a updates road edge information related to the positional relationship between the outermost lane and the road edge, and to deformation of the shape of the lane or the road edge. The lane update unit 323b updates lane information not described in the general-purpose map Mg, such as the number of lanes of the traveling road, complicated changes in lane shape at intersections and branching/merging sections, lane width information, and angles between traveling roads. The undulation update unit 323c updates undulation information related to unevenness and inclination of the traveling road.
[0058]
In addition to the update processing described above, each of these units determines, on the basis of the reliability of environmental information such as road edges, lanes, and undulations of the traveling road, whether that information may be reused when the vehicle travels the same place next time. For example, environmental information with low reliability is treated as provisional information to be used only for the current road shape recognition by the traveling road correspondence control unit 4, and is not transmitted to the map information unit 2. On the other hand, environmental information with high reliability is determined to be not only used for the current road shape recognition by the traveling road correspondence control unit 4 but also reused thereafter, and is transmitted to the map information unit 2 together with its reliability and stored in the detailed map Md.
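A minimal sketch of this reliability-based routing follows. The threshold and data layout are assumptions, since the text does not define how reliability is scored.

```python
RELIABILITY_THRESHOLD = 0.8  # assumed value; the text defines no concrete scale

def route_environmental_info(info: dict, reliability: float,
                             current_frame: list, detailed_map_md: list) -> None:
    """Low-reliability results are used only for the current road shape
    recognition; high-reliability results are additionally stored, with
    their reliability, in the detailed map Md for reuse on the next drive."""
    current_frame.append(info)  # always usable for the current recognition
    if reliability >= RELIABILITY_THRESHOLD:
        detailed_map_md.append({**info, "reliability": reliability})
```
[0059]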
As a specific example of the reuse determination, when a convex shape orthogonal to the traveling direction is measured on a road heading into a residential area, the undulation update unit 323c determines that the convex shape is highly likely to be a speed-suppressing bump or hump, and transmits undulation information on the convex shape to the map information unit 2 as high-reliability environmental information to be registered in the detailed map Md.
[0060]
Note that even environmental information with high reliability need not be transmitted to the map information unit 2 immediately after a single measurement; it may be transmitted when the same information is measured repeatedly. This increases the reliability of the detailed map Md by registering only environmental information repeatedly measured at the same place, because information measured only once may reflect a provisional lane width during construction or a failure to detect a white line hidden by a parked vehicle.
[0061]
In addition, in a case where the reliability of the environmental information obtained by the current environment recognition is extremely high and the degree of coincidence with the environmental information already stored in the map information unit 2 is also high, the detailed shape on the detailed map Md may be updated by fusing the existing environmental information and the new environmental information.
Next, further details of the detailed road shape addition unit 322 of which specific processing has been described with reference to FIG. 7 will be described with reference to FIG. 8. As illustrated here, the detailed road shape addition unit 322 includes a priority adjustment unit 322a, an additional information selection unit 322b, and a shape restoration unit 322c. [0062]
As described with reference to FIG. 7, the detailed
road shape addition unit 322 adds detailed shapes such as a lane width, the number of lanes, and road shoulder information to the basic structure acquired by the basic road shape reading unit 321 to restore the detailed structure. However, when there are a plurality of information sources of detailed shapes, it is necessary to determine which information source is preferentially used. [0063]
Therefore, the priority adjustment unit 322a adjusts the priority of the information source used when the detailed structure of the road is restored. [0064]
As illustrated in the "fixed environmental information" column in FIG. 9, for environmental information with a fixed value, the information source with the highest priority (the smallest priority number) among the available sources is used. Taking the "lane width" as an example, when the "lane width" of each road is set in the server map Ms acquired from the server, the "lane width" information of the server map Ms is used (priority 1). When the "lane width" information cannot be acquired from the server map Ms but can be acquired from the general-purpose map Mg, the "lane width" information of the general-purpose map Mg is used (priority 2). When the "lane width" information can be acquired from neither the server map Ms nor the general-purpose map Mg, the highest-priority value among the legal values and default values defined in Tables 3 to 8 is used (priority 3 to priority 5). Further, when Tables 3 to 8 are not available, fully fixed default values (in the above example, environmental information such as two lanes, a lane width of 3 m, and a road shoulder of 1.5 m) may be used.
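The priority chain for a fixed value such as the lane width can be pictured as a first-hit walk over the sources of FIG. 9. The callable-per-source interface below is an illustrative assumption, not the device's actual interface.

```python
def select_lane_width(server_map, general_map, legal_tables, fixed_default=3.0):
    """Return the first available "lane width", walking the sources in the
    FIG. 9 priority order: server map Ms (1), general-purpose map Mg (2),
    legal/default values of Tables 3 to 8 (3-5), then a fixed default."""
    for source in (server_map, general_map, legal_tables):
        value = source()
        if value is not None:
            return value
    return fixed_default

# Example: no server-map entry, but the general-purpose map has 3.25 m.
width = select_lane_width(lambda: None, lambda: 3.25, lambda: 3.0)  # -> 3.25
```
[0065]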
Further, as described above, since the detailed shape update unit 323 registers, in the detailed map Md of the map information unit 2, the environmental information exceeding a certain degree of reliability among the recognized environmental information, the “lane width” of the next change point can be grasped in advance by reusing the environmental information registered in the detailed map Md. [0066]
Note that, although the priorities in FIG. 9 are expressed as integers, a non-integer priority such as 1.4 may also be set. For example, while integer priorities are assigned to the fixed environmental information registered in the server map Ms, the general-purpose map Mg, and the like, a non-integer priority corresponding to the reliability, within the range of priority 1 to priority 4, may be assigned to dynamic environmental information, whose reliability fluctuates, recognized from the measurement data of the environment measurement sensor 1, so that the environmental information with the smaller priority number can be used.
[0067]
The case where the detailed shape added to the basic structure is the "lane width" has been described above as an example, but the detailed structure can likewise be set using priorities when the "number of lanes" or the "road shoulder" is added. In addition, since some information sources have missing information, the detailed structure containing the desired information may in that case be restored by integrating information acquired from a plurality of information sources.
[0068]
In this manner, the detailed road shape addition unit 322 updates the priority of the information stored in the detailed map Md according to the reliability of each piece of environmental information. When no environmental information has been stored on the detailed map Md in the past, environmental information of a certain reliability recognized from the measurement data of the environment measurement sensor 1 is registered with priority 4. Thereafter, when newly recognized environmental information falls within the error range of the stored value, the values are averaged and the priority is gradually raised; outliers are deleted and lower the priority, and the average of the non-outlier environment recognition results is stored. Alternatively, the filter may be designed so that the environment recognition results are filtered with a larger weight on the current value and a smaller weight on past values. When outliers are input continuously, the stored value may be wrong, or the traveling road may actually have changed due to road construction. Therefore, when a certain fixed time or longer has elapsed, or when significantly different data is input five or more times in succession, the priority is re-initialized to priority 4 and, at the same time, the stored value is rewritten with the new data. In this way, data updates due to actual road construction and the like are also handled.
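The update rule just described can be pictured for a single stored value as follows. The error tolerance is an assumption, while the priority-4 initialization and the five-consecutive-outliers rule follow the text.

```python
class StoredShapeValue:
    """One stored value (e.g. a lane width) on the detailed map Md:
    agreeing measurements are averaged and raise the priority (smaller
    number = higher priority); five or more consecutive outliers
    re-initialize the entry (road construction etc.)."""

    def __init__(self, value: float, priority: int = 4, tol: float = 0.3):
        self.value, self.priority, self.tol = value, priority, tol
        self.n = 1            # number of agreeing measurements averaged so far
        self.outlier_run = 0  # consecutive significantly different inputs

    def update(self, measured: float) -> None:
        if abs(measured - self.value) <= self.tol:
            # within the error range: fold into the running average
            self.n += 1
            self.value += (measured - self.value) / self.n
            self.priority = max(1, self.priority - 1)  # raise the priority
            self.outlier_run = 0
        else:
            self.outlier_run += 1
            if self.outlier_run >= 5:
                # persistent disagreement: the road itself likely changed
                self.value, self.n = measured, 1
                self.priority, self.outlier_run = 4, 0
```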
As illustrated in FIG. 10, the traveling road correspondence control unit 4 includes a recognition processing selection unit 41, a recognition processing setting unit 42, a road edge recognition unit 43, a lane recognition unit 44, an exposure adjustment unit 45, a camera posture correction unit 46, and a sensor abnormality detection unit 47.
[0069]
Based on the detailed structure of the change point (intersection or the like) acquired in advance from the traveling road analysis unit 3, the traveling road correspondence control unit 4 recognizes the road shape from the measurement data of the environment measurement sensor 1: it dynamically changes the recognition logic of the road shape recognition processing, which is initially set assuming a single road, in accordance with the road shape of the traveling road, and recognizes the environmental information with a more appropriate recognition logic. Similarly, exposure adjustment, camera posture correction, and sensor abnormality detection are performed based on the detailed structure of the traveling road, enabling more appropriate processing than the original processing. Each will be described in detail below.
The recognition processing selection unit 41 selects a recognition logic (the number of processing regions and a feature amount extraction method) according to the basic structure of the traveling road, and, as illustrated in FIG. 11, includes a road edge processing region number selection unit 411, a road edge extraction method selection unit 412, a lane processing region number selection unit 413, and a lane extraction method selection unit 414. Note that, in the following description, the processing data of the traveling road correspondence control unit 4 is described as image data captured by the stereo camera, but it may be another type of data, such as 3D point cloud data acquired from the environment measurement sensor 1.
[0070]
The road edge processing region number selection unit 411 selects the number of regions in which the road edge extraction processing is to be executed according to the basic structure of the traveling road. When the front sensor 11 is a stereo camera, it measures roughly 100 m ahead of the own vehicle V. Therefore, for example, if the traveling road within 100 m ahead is a single road, it is assumed that there are two parallel road edges, one on each of the left and right of the own vehicle V, and the road edge processing region number "2" is selected so that they can be extracted. If there is a cross on the traveling road within 100 m ahead, it is assumed that there are two parallel road edges on the left and right of the own vehicle V and two more parallel road edges crossing in front of the own vehicle V, and the road edge processing region number "4" is selected so that they can be extracted.
[0071]
Similarly, the lane processing region number selection unit 413 selects the number of regions in which the lane extraction processing is to be executed according to the basic structure of the traveling road. For example, if the traveling road within 100 m ahead is a single road, it is assumed that there are lanes along the traveling direction of the own vehicle V, and the lane processing region number "1" is selected so that these lanes can be extracted. If there is a cross on the traveling road within 100 m ahead, it is assumed that there are lanes along the traveling direction of the own vehicle V and lanes perpendicular to the traveling direction on the roads in front of, behind, left of, and right of the intersection, and the lane processing region number "4" is selected so that these lanes can be extracted. The sketch after this paragraph illustrates the mapping.
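The selection in these two units amounts to a small lookup from the basic structure to region counts. The "2"/"1" single-road and "4"/"4" cross values follow the text; the T-junction row is an assumption added for illustration.

```python
# Processing region counts per basic structure within the stereo camera's
# roughly 100 m range. Only "single road" and "cross" come from the text.
PROCESSING_REGION_NUMBERS = {
    "single road": {"road_edge": 2, "lane": 1},
    "cross":       {"road_edge": 4, "lane": 4},
    "T-junction":  {"road_edge": 3, "lane": 3},  # assumed by analogy
}

def select_region_numbers(basic_structure: str) -> dict:
    """Fall back to the single-road assumption when the structure is
    unknown, matching the initial setting of the recognition processing."""
    return PROCESSING_REGION_NUMBERS.get(basic_structure,
                                         PROCESSING_REGION_NUMBERS["single road"])
```
[0072]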
Further, the road edge extraction method selection unit 412 selects a method of extracting the road edge feature amount according to the basic structure of the traveling road. For example, when the traveling road within 100 m ahead is a single road, a road edge extraction method that searches the set road edge processing regions in the lateral direction is selected. If there is a cross on the traveling road within 100 m ahead, a road edge extraction method that searches in the lateral direction is selected for the road edge processing regions along the traveling direction of the own vehicle V, and a road edge extraction method that searches in the longitudinal direction is selected for the road edge processing regions perpendicular to the traveling direction of the own vehicle V.
[0073]
Similarly, the lane extraction method selection unit 414 selects a method of extracting the lane feature amount according to the basic structure of the traveling road. For example, if the traveling road within 100 m ahead is a single road, a lane extraction method that searches the set lane processing regions in the lateral direction is selected. If there is a cross on the traveling road within 100 m ahead, a lane extraction method that searches in the lateral direction is selected for the lane processing regions along the traveling direction of the own vehicle V, and a lane feature amount extraction method that searches in the longitudinal direction is selected for the lane processing regions perpendicular to the traveling direction of the own vehicle V.
As illustrated in FIG. 12, the recognition processing setting unit 42 includes a road edge processing region setting unit 421, a per-road-edge-region extraction method setting unit 422, a lane processing region setting unit 423,
and a per-lane-region extraction method setting unit 424, and each setting unit can process measurement data of the environment measurement sensor 1 using the processing region number and the feature amount extraction method selected by the recognition processing selection unit 41. [0074]
For example, when there is a cross on the traveling road within 100 m ahead, the road edge processing region setting unit 421 sets in the image data, based on the selection by the road edge processing region number selection unit 411, four regions (see FIG. 13A) sized so that the shoulder and the step at the boundary between the traveling road and the sidewalk are included, as the road edge processing regions. Under the same conditions, the lane processing region setting unit 423 sets in the image data, based on the selection by the lane processing region number selection unit 413, four regions (see FIG. 13B) sized so that the lanes on the roads in front of, behind, left of, and right of the intersection are included, as the lane processing regions. Further, the per-road-edge-region extraction method setting unit 422 and the per-lane-region extraction method setting unit 424 set the feature amount extraction methods selected by the road edge extraction method selection unit 412 and the lane extraction method selection unit 414 as the feature amount extraction method for each region.
[0075]
In consideration of the influence of the GNSS error, the self-position correction by the environment measurement sensor 1, the vehicle body pitching, and the like, it is desirable that the size of the processing region set by the road edge processing region setting unit 421 and the lane processing region setting unit 423 has a margin in consideration of the positional deviation on the image.
The road edge recognition unit 43 recognizes the road edge using the processing regions and the feature amount extraction methods set by the recognition processing setting unit 42. As the road edge feature amount, step portions that are three-dimensionally higher or lower than the traveling road are extracted; when there are both a shoulder and a wall or the like beside the traveling road, both feature amounts are extracted. Various three-dimensional objects such as walls, buildings, trees, utility poles, and side gutters are also extracted as feature amounts, but for the own vehicle V traveling on the road, the step it would contact first is the important one, and the vehicle must travel so as not to contact it. For this reason, noise removal processing is performed that emphasizes the step the own vehicle V would contact first, viewed with the own vehicle V as the center, and deletes three-dimensional objects existing farther from the own vehicle V than that step. After this processing, the steps that the own vehicle V is likely to contact first are used as the road edge point group around the own vehicle V. Further, the road edge point group of the traveling road is built up in time series using the own vehicle behavior or corresponding points in the images, and model fitting is performed between this time-series road edge point group and the detailed road structure.
[0076]
FIG. 14 is an example of the model fitting processing performed by the road edge recognition unit 43. Even if the detailed structure of the traveling road that the traveling road correspondence control unit 4 acquires from the traveling road analysis unit 3 is an orthogonal cross as illustrated in FIG. 14(a), the actual shape of the cross often differs. Therefore, the road edge recognition unit 43 adjusts the positions of the road model control points (the white circles in FIG. 14(b)) that define the road shape in consideration of the positions of the road edge point group (the black circles in FIG. 14(b)) recognized from the measurement data of the environment measurement sensor 1. In the example of FIG. 14(b), since the road edge point group of the road running laterally in the drawing is observed on a straight line rising slightly to the right, the road edge recognition unit 43 performs model fitting that moves the road model control points of the cross onto the dotted line rising slightly to the right, forming a new detailed structure.
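One simple way to realize such fitting is to shift each control point toward the mean of the nearby observed road edge points, as sketched below. The neighborhood radius and update gain are illustrative assumptions rather than the method prescribed by the text.

```python
import numpy as np

def fit_control_points(control_pts, edge_pts, radius: float = 5.0,
                       gain: float = 0.5) -> np.ndarray:
    """Move each road model control point (white circles in FIG. 14(b))
    part of the way toward the mean of the observed road edge points
    (black circles) within `radius` meters, bending the predicted cross
    toward the measured shape. Inputs are (N, 2) arrays in meters."""
    control_pts = np.asarray(control_pts, dtype=float)
    edge_pts = np.asarray(edge_pts, dtype=float)
    fitted = control_pts.copy()
    for i, cp in enumerate(control_pts):
        dist = np.linalg.norm(edge_pts - cp, axis=1)
        near = edge_pts[dist < radius]
        if len(near):
            fitted[i] = cp + gain * (near.mean(axis=0) - cp)
    return fitted
```
[0077]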
As described above, the road edge recognition unit 43 of the present embodiment not only acquires the detailed structure of the road in the traveling direction from the traveling road analysis unit 3 in advance and uses the road edge processing regions and the road edge feature amount extraction methods corresponding to that structure, but also performs model fitting on the acquired detailed structure. This makes it possible to accurately recognize the road edges of complicated traveling roads, which are fundamentally difficult to recognize accurately with conventional road edge recognition technology designed for single roads.
The lane recognition unit 44 recognizes lanes using the processing regions and the feature amount extraction methods set by the recognition processing setting unit 42. Omitting redundant description of the points shared with the road edge recognition unit 43, the differences are as follows. The lane recognition unit 44 recognizes lanes by removing noise from the feature amounts extracted with the processing regions and the feature amount extraction methods set by the recognition processing setting unit 42. As the lane feature amount, points characteristic of a white line, that is, a boundary line along the traveling direction, are detected. Since the traveling road is roughly straight and extreme curves are rare, the approximate lane positions and the number of candidates can be found by linear search in a coarse search, and noise removal is performed through this processing. Further, lanes continuously connected in time series are restored using the own vehicle behavior.
The exposure adjustment unit 45 corrects the exposure of the stereo camera or the like using information useful for determining the exposure condition, such as an approximate value of the width to the road shoulder of the traveling road. [0078]
In conventional exposure adjustment, the brightness of the current image is not measured according to the traveling road; basically, the brightness of the road surface is measured using a road-surface region that does not include lanes as the processing region, and the exposure amount of the next frame is adjusted according to that brightness so that the parallax of the road surface is degraded as little as possible. Alternatively, even when a white line or road surface paint is present, the exposure amount is adjusted according to the road surface brightness by using the mode value. Basically, exposure can be adjusted more stably by analyzing the road surface brightness over a wider range; however, since the shape of the traveling road is unknown, the brightness is analyzed within the own lane or within a processing region protruding only slightly from the own lane.
[0079]
However, when an approximate value of the width to the shoulder is available from the detailed shape analysis described above, the processing region for analyzing the brightness of the traveling road for exposure adjustment is determined using the result of the detailed shape analysis or the previous environment recognition result. As a result, the road surface brightness can be analyzed over as wide a range as possible according to the traveling road, and more stable exposure adjustment can be realized.
When the inclination information of the traveling road is registered in the general-purpose map Mg, the camera posture correction unit 46 uses the inclination information to correct the posture of the environment measurement sensor 1. In addition, if there is information such as a lane width and a shoulder width, the height information of the camera and the like can be corrected using the information. [0080]
In addition, when the relative posture between the camera and the road surface is obtained correctly, the processing regions and the like can be set more appropriately.
The sensor abnormality detection unit 47 detects an abnormality of the environment measurement sensor 1 using map information such as the detailed map Md acquired from the map information unit 2. Since the sensor abnormality detection unit 47 can acquire in advance, from the map information, a detailed structure such as the white lines and road shoulders at an intersection or the like in the traveling direction, it determines that the environment measurement sensor 1 may be abnormal when a feature such as a white line or a road shoulder that should be detectable when passing through the intersection or the like cannot be detected. Then, when such a determination is repeated within a short time, an abnormality of the environment measurement sensor 1 is detected and notified to the driver and the automatic driving system.
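A sketch of this repeated-miss check follows; the window length and miss threshold are assumptions, since the text only says "repeatedly within a short time".

```python
class SensorAbnormalityDetector:
    """Flag a possible sensor abnormality when features that the map says
    should be visible (white lines, shoulders at an intersection) are
    repeatedly not detected. Window and threshold values are assumed."""

    def __init__(self, window: int = 20, max_misses: int = 5):
        self.window, self.max_misses = window, max_misses
        self.recent = []  # 1 = expected feature missed, 0 = detected

    def report(self, expected: bool, detected: bool) -> bool:
        if expected:
            self.recent.append(0 if detected else 1)
            self.recent = self.recent[-self.window:]  # keep a short history
        # True -> notify the driver and the automatic driving system
        return sum(self.recent) >= self.max_misses
```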
Although the basic control by the traveling road correspondence control unit 4 has been described above, when a plurality of lanes are recognized on the traveling road using the recognition logic described above, or when a landmark that must be considered in automatic driving is recognized, the detailed structure analysis unit 32 may give each recognized lane or landmark a label L indicating its type so that the meaning of the environment in which the own vehicle V is traveling can be recognized. For example, as illustrated in FIG. 15, when lanes and landmarks are recognized heading toward a highway gate, a label L1 indicating a general traveling lane for vehicles not equipped with an ETC device, a label L2 indicating a traveling lane for vehicles equipped with an ETC device, a label L3 indicating the position of the gate, a label L4 indicating an oncoming lane, and the like are given to the recognized lanes and landmarks.
[0081]
When these labels are provided, the own vehicle V during automatic traveling can not only avoid wrong-way driving on the oncoming lane, but can also select an appropriate traveling lane according to whether it is equipped with an ETC device, and can realize automatic driving in which it stops or decelerates when passing through the gate.
Next, an example of a processing flow performed by the in-vehicle environment recognition device 100 of the present embodiment will be described with reference to FIG. 16. [0082]
First, in S1, the front sensor 11 of the environment measurement sensor 1 captures left and right images in front of the own vehicle V with the stereo camera. [0083]
In S2, the matching unit 12 of the environment measurement sensor 1 performs stereo matching on the left and right images captured by the stereo camera. [0084]
In S3, the 3D point group acquisition unit 13 of the environment measurement sensor 1 acquires a 3D point group on the basis of the stereo matching result by the matching unit 12. [0085]
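Steps S1 to S3 form a standard stereo pipeline; the following sketch uses OpenCV for illustration, and the embodiment's matching unit 12 and 3D point group acquisition unit 13 are not necessarily built this way.

```python
import cv2
import numpy as np

def point_group_from_stereo(left_gray, right_gray, Q):
    """Stereo-match rectified 8-bit left/right images and lift the
    disparity map to a 3D point group.

    Q is the 4x4 disparity-to-depth matrix from stereo rectification.
    """
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=5)
    # compute() returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    points_3d = cv2.reprojectImageTo3D(disparity, Q)  # HxWx3 array
    return points_3d[disparity > 0]                   # keep valid matches only
```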
In S4, the basic structure analysis unit 31 of the traveling road analysis unit 3 acquires a basic road shape including a node and a link in the vicinity of the own vehicle V from the general-purpose map information Mg held by the map information unit 2. [0086]
In S5, the detailed structure analysis unit 32 of the traveling road analysis unit 3 restores the detailed road shape on the basis of the basic road shape acquired by the basic structure analysis unit 31. In a case where information on past environment recognition results is also stored, which basis is used to restore the detailed road shape is determined according to the priority. [0086]
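A minimal sketch of the priority rule in S5 follows, assuming a fixed ordering of the three possible bases; in the device a priority is set for each value, so the ordering shown (past result first) is only an assumption.

```python
from typing import Optional

def restore_value(past_recognition: Optional[float],
                  ordinance_value: Optional[float],
                  road_type_value: Optional[float]) -> Optional[float]:
    """Pick the first available value in descending priority order.

    past_recognition: value obtained in past recognition at the place.
    ordinance_value:  value from type/grade info per a road structure ordinance.
    road_type_value:  value from road type or speed limit information.
    """
    for candidate in (past_recognition, ordinance_value, road_type_value):
        if candidate is not None:
            return candidate
    return None
```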
In S6, the recognition processing selection unit 41 of the traveling road correspondence control unit 4 selects a recognition logic for recognizing a road edge or a lane according to a road shape. [0088]
In S7, the recognition processing setting unit 42 of the traveling road correspondence control unit 4 sets the recognition logic for recognizing a road edge and a lane according to the road shape. [0089]
In S8, the road edge recognition unit 43 of the traveling road correspondence control unit 4 executes the road edge recognition processing, reflecting the partial changes to the recognition processing made by the recognition processing setting unit 42, the changes to the processing region or its position made by parameter adjustment, and the like. [0090]
In S9, the lane recognition unit 44 of the traveling road correspondence control unit 4 executes the lane recognition processing, reflecting the partial changes to the recognition processing made by the recognition processing setting unit 42, the changes to the processing region or its position made by parameter adjustment, and the like. The order of S8 and S9 may be reversed. [0091]
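The selection and setting of the recognition logic in S6 and S7, which S8 and S9 then execute, might look like the following sketch; the shape names, region counts, and extractor names are assumptions made only for illustration.

```python
def select_recognition_logic(road_shape: str) -> dict:
    """Map a basic road shape onto recognition-logic settings, i.e. how
    many road-edge/lane processing regions are used and which feature
    extraction method runs in each region."""
    if road_shape == "intersection":
        # Crossing roads need extra regions angled along each arm.
        return {"num_regions": 4, "extractor": "edge_segment_fit"}
    if road_shape == "curve":
        return {"num_regions": 2, "extractor": "curved_lane_model"}
    # Default single-road case.
    return {"num_regions": 1, "extractor": "straight_lane_model"}
```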
In S10, the detailed structure analysis unit 32 of the traveling road analysis unit 3 determines whether the environmental information regarding the lane and the road edge can be reflected in the detailed map Md, according to the reliability of the recognition results for the lane and the road edge. [0092]
When it is determined that the detailed map Md can be updated, in S11 the detailed map update unit 24 determines whether to update the general-purpose map Mg by comparing the new result with the information stored from the results recognized so far. For example, in a case where the new result differs greatly from the stored result, it is reflected on the general-purpose map Mg only after confirming that such a result continues many times. [0093]
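The confirmation rule of S11 can be pictured with the following sketch; the difference threshold and the required number of repeats are assumed values, not taken from the embodiment.

```python
class MapUpdateGate:
    """Admit a recognition result into the general-purpose map Mg only
    after it has disagreed with the stored value consistently."""

    def __init__(self, diff_threshold: float = 0.5, required_repeats: int = 5):
        self.diff_threshold = diff_threshold
        self.required_repeats = required_repeats
        self.consecutive_diffs = 0

    def should_update(self, stored_value: float, new_value: float) -> bool:
        if abs(new_value - stored_value) > self.diff_threshold:
            self.consecutive_diffs += 1
        else:
            # A result that agrees with storage resets the streak.
            self.consecutive_diffs = 0
        return self.consecutive_diffs >= self.required_repeats
```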
In S12, the external device control unit 5 uses the environmental information recognized this time to provide the driver with information regarding the traveling road before arriving at an intersection or the like, or to provide the automatic driving system in advance with the information necessary for realizing autonomous driving. [0094]
As described above, according to the in-vehicle environment recognition device of the present embodiment, it is possible to recognize the detailed shape of a complicated road such as an intersection before reaching it, by using an environment measurement sensor mounted on the vehicle and simple map information.
Reference Signs List [0095]
1 environment measurement sensor
11 front sensor
11L left camera
11R right camera
12 matching unit
13 3D point group acquisition unit
2 map information unit
21 general-purpose map storage unit
22 GNSS unit
23 detailed map storage unit
24 detailed map update unit
3 traveling road analysis unit
31 basic structure analysis unit
311 traveling road data reading unit
312 node and link analysis unit
313 basic road shape analysis unit
314 own vehicle position analysis unit
314a GNSS error analysis unit
314b sensor error analysis unit
314c time-series integrated correction unit
32 detailed structure analysis unit
321 basic road shape reading unit
322 detailed road shape addition unit
322a additional information priority unit
322b additional information selection unit
322c shape restoration unit
323 detailed shape update unit
323a road edge update unit
323b lane update unit
323c undulation update unit
4 traveling road correspondence control unit
41 recognition processing selection unit
411 road edge processing region number selection unit
412 road edge extraction method selection unit
413 lane processing region number selection unit
414 lane extraction method selection unit
42 recognition processing setting unit
421 road edge processing region setting unit
422 per-road-edge-region extraction method setting unit
423 lane processing region setting unit
424 per-lane-region extraction method setting unit
43 road edge recognition unit
44 lane recognition unit
45 exposure adjustment unit
46 camera posture correction unit
47 sensor abnormality detection unit
5 external device control unit
6 external device
WE CLAIM
[Claim 1]
An environment recognition device that recognizes an environment of a traveling road on the basis of measurement data output from an environment measurement sensor, the environment recognition device comprising:
a basic structure analysis unit that analyzes a basic structure of the traveling road on the basis of map information obtained from a general-purpose map or a server map;
a detailed structure analysis unit that restores a detailed structure of the traveling road on the basis of the basic structure of the traveling road;
a traveling road correspondence control unit that changes a recognition logic for recognizing environmental information from the measurement data according to the basic structure of the traveling road; and
an external device control unit that controls the environment measurement sensor or an external device on the basis of the environmental information recognized by the traveling road correspondence control unit.
[Claim 2]
The environment recognition device according to claim 1, wherein
the map information is node information and link information, and
the detailed structure is restored by adding, to a basic structure including the node information and the link information, any environmental information among the number of lanes, a width of a lane, a shoulder position, a white line intersection angle, and a shoulder intersection angle.
[Claim 3]
The environment recognition device according to claim 1, wherein
a recognition logic changed according to the basic structure of the traveling road is the number of processing regions, a place of the processing region, or a feature amount extraction method in recognition processing of a road edge or a lane.
[Claim 4]
The environment recognition device according to claim 3, wherein
the recognition logic includes at least one of feature amount extraction processing, noise removal processing, and model fitting processing according to a place of the processing region.
[Claim 5]
The environment recognition device according to claim 4, wherein
the model fitting processing is processing of setting a white line intersection angle or a shoulder intersection angle included in a detailed structure of the traveling road on the basis of a three-dimensional arrangement of feature amounts extracted in the feature amount extraction processing.
[Claim 6]
The environment recognition device according to claim 1, wherein
the detailed structure analysis unit restores the detailed structure of the traveling road by providing, to a lane or a landmark recognized by the traveling road correspondence control unit, a label indicating a type of the lane or the landmark.
[Claim 7]
The environment recognition device according to claim 2, wherein
the detailed structure analysis unit restores a value of the environmental information on the basis of type information or grade information determined by a road structure ordinance acquired from the map information.
[Claim 8]
The environment recognition device according to claim 2, wherein
the detailed structure analysis unit restores a value of the environmental information on the basis of road type information or speed limit information acquired from the map information.
[Claim 9]
The environment recognition device according to claim 2, wherein
the detailed structure analysis unit restores a value of the environmental information using a value that was obtained in past recognition processing at the place and is acquired from the map information.
[Claim 10]
The environment recognition device according to claim 2, wherein
the detailed structure analysis unit selects, to restore the detailed structure, the value having the highest priority among the priorities set for the respective values, in a case where a plurality of values among a value based on type information or grade information specified by a road structure ordinance, a value based on road type information or speed limit information, and a value obtained in past recognition processing at the place can be used.