
Driving Assistance Device And Driving Assistance System

Abstract: The purpose of the present invention is to realize lane deviation warning/control at an intersection, on a multilane road, and in an environment where no lane dividing lines are present and GNSS accuracy is low. Provided is a driving assistance device comprising: a recording unit that records a plurality of past traveling positions of a vehicle and position estimation information relating to the past traveling positions; a relative position estimation unit that estimates a relative position of the vehicle from an output of a sensor for detecting the surroundings, adds the history of the relative position of the vehicle to the past traveling positions, and adds the output of the sensor used for estimating the relative position to the position estimation information; a position estimation unit that estimates, from the output of the sensor and the position estimation information of the recording unit, a position of the vehicle with respect to the past traveling positions and adds the history of the position to the past traveling positions; a deviation determination unit that, by using a distribution of the plurality of past traveling positions, the position of the vehicle with respect to the plurality of past traveling positions, and a predetermined reference value, determines a deviation from the plurality of past traveling positions; and a control unit that, when the deviation has been determined, issues a warning or controls the vehicle so as to cancel the deviation.


Patent Information

Application #
Filing Date
07 June 2022
Publication Number
41/2022
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
archana@anandandanand.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-12-03
Renewal Date

Applicants

HITACHI ASTEMO, LTD.
2520, Takaba, Hitachinaka-shi, Ibaraki 3128503

Inventors

1. KUME Hideyuki
c/o HITACHI, LTD., 6-6, Marunouchi 1-chome, Chiyoda-ku, Tokyo 1008280
2. TOYODA Hidehiro
c/o HITACHI AUTOMOTIVE SYSTEMS, LTD., 2520, Takaba, Hitachinaka-shi, Ibaraki 3128503

Specification

Title of Invention: Driving Assistance Device and Driving Assistance System
Technical field
[0001]
 The present invention relates to a driving assistance device and a driving assistance system.
Background Art
[0002]
 One of the driving assistance functions of automobiles is lane departure warning/control, which issues a warning to the driver when departure from the lane is detected and implements control to eliminate the departure. For this type of lane departure warning and control, a method based on lane markings detected by onboard sensors is widely used. However, even where lane markings exist, it is difficult to implement warning and recovery control in environments where the in-vehicle sensor cannot detect the markings because they are faded or otherwise obscured.
[0003]
 As an example of a technique for addressing this problem, paragraphs 0007 and 0008 of Patent Document 1 state: "The present invention refers to the past travel locus data of a car navigation system and uses the past travel locus information as LKA/LDW information. An object of the present invention is to realize a control device for a vehicle that can perform lane keeping assistance or lane departure prevention even when lane information cannot be detected from images captured by the camera, or when the vehicle is traveling on a road not included in the map information because the map information has not been updated." and "Therefore, in order to eliminate the above inconvenience, the present invention is a vehicle control device that performs lane keeping assistance or lane departure prevention, comprising: a travel locus storage means for storing the travel locus of the vehicle; a position detection means for detecting the current position of the vehicle; and a lane deviation determination means for determining the degree of deviation based on the travel locus stored in the travel locus storage means and the current position of the vehicle detected by the position detection means."
[0004]
 That is, Patent Document 1 determines departure from the lane based on the past travel positions and the current position of the vehicle, so that warning and return control upon lane departure can be realized even in an environment where there are no lane markings.
Prior Art Documents
Patent Literature
[0005]
Patent document 1: JP 2015-162228 A
SUMMARY OF THE INVENTION
Problems to be Solved by the Invention
[0006]
 However, Patent Document 1, as described in paragraph 0014 and elsewhere, uses the past travel positions (travel trajectory) stored in the travel trajectory storage means of the car navigation system. In environments where the position estimation accuracy of the car navigation system is low, such as where the accuracy of GNSS (Global Navigation Satellite System) is poor, the quality of the past travel positions (travel trajectory) used as a reference is therefore also poor, and lane deviation cannot be determined appropriately. In addition, since deviation is determined based on the average of multiple past travel positions, on multi-lane roads and at intersections the averaging process yields an inappropriate reference travel position that the vehicle would not normally follow, and deviation cannot be determined appropriately with reference to this inappropriate reference travel position.
[0007]
 Therefore, an object of the present invention is to provide a driving assistance device capable of realizing lane deviation warning and control even in environments where there are no lane markings and GNSS accuracy is low, on roads with multiple lanes, and at intersections.
Means for Solving the Problems
[0008]
 A representative driving assistance device of the present invention comprises: a recording unit that records a plurality of past travel positions of a vehicle and position estimation information relating to the past travel positions; a relative position estimation unit that estimates the relative position of the vehicle from the output of a sensor that detects the surroundings of the vehicle, adds the history of the relative position to the past travel positions, and adds the sensor output used for estimating the relative position to the position estimation information; a position estimation unit that estimates, from the output of the sensor and the position estimation information in the recording unit, the position of the vehicle with respect to the past travel positions and adds the history of that position to the past travel positions; a deviation determination unit that determines deviation from the plurality of past travel positions by using a distribution of the plurality of past travel positions, the position of the vehicle with respect to the plurality of past travel positions, and a predetermined reference value; and a control unit that, when deviation has been determined, issues a warning or controls the vehicle so as to cancel the deviation.
Effects of the Invention
[0009]
 According to the present invention, lane deviation warning and return control can be realized even in an environment where there are no lane markings and GNSS accuracy is low, on a road with multiple lanes, or at an intersection.
Brief Description of the Drawings
[0010]
[Fig. 1] Functional block diagram of a driving assistance device according to Embodiment 1
[Fig. 2] Diagram showing an example of position estimation information recorded in a recording unit
[Fig. 3] Diagram showing an example of vehicle/surrounding information recorded in a recording unit
[Fig. 4] Diagram showing an example of deviation determination on a three-lane road
[Fig. 5] Diagram showing an example of deviation determination at an intersection
[Fig. 6] Diagram showing an example of deviation determination when an exceptional past travel position is included
[Fig. 7] Diagram showing an example of deviation determination based on speed at an intersection
[Fig. 8] Diagram showing an example of deviation determination based on speed in front of a stop line
[Fig. 9] Diagram showing an example of deviation determination based on speed in a parking lot
[Fig. 10] Diagram showing an example of a speed distribution
[Fig. 11] Diagram showing an example of deviation determination based on the lighting status of direction indicators
[Fig. 12] Diagram showing a distribution represented by a grid map
[Fig. 13] Diagram showing a block configuration of the driving assistance system of the second embodiment
[Fig. 14] Diagram showing the flow of processing of the motion determination unit
[Fig. 15] Diagram showing a block configuration of the driving assistance device of the third embodiment
[Fig. 16] Diagram showing an example of a screen presented by the user input unit
[Fig. 17] Diagram showing an example of deviation determination based on valid/invalid information
[Fig. 18] Diagram showing the system configuration of the driving assistance device of the fourth embodiment and the recorded data sharing device
[Fig. 19] Diagram showing a block configuration of the driving assistance device of the fifth embodiment
[Fig. 20] Diagram showing an example of deviation determination using a target trajectory
[Fig. 21] Diagram showing a block configuration of the driving assistance device of the sixth embodiment
[Fig. 22] Diagram showing an example of processing in the similar position estimation unit
MODE FOR CARRYING OUT THE INVENTION
[0011]
 An embodiment of the driving support system of the present invention will be described below with reference to the drawings.
Embodiment 1
[0012]
 A driving assistance device according to a first embodiment of the present invention will be described below with reference to FIGS. 1 to 12.
[0013]
 (Block Configuration)
 FIG. 1 is a functional block diagram of the driving assistance device 1 of this embodiment. As shown here, the driving assistance device 1 includes a relative position estimation unit 11, a recording unit 12, a position estimation unit 13, a deviation determination unit 14, and a control unit 15. Measurement data of the sensor 10 is input to the relative position estimation unit 11 and the position estimation unit 13. Specifically, the driving assistance device 1 is a computer including hardware such as an arithmetic unit such as a CPU, a main storage device such as a RAM, an auxiliary storage device such as a ROM, and a communication device. Each function described later is realized by the arithmetic unit executing a program loaded from the auxiliary storage device into the main storage device.
[0014]
 The sensor 10 is mounted on the vehicle and measures the environment around the vehicle; it is, for example, a monocular camera, a stereo camera, a LiDAR, a millimeter-wave radar, or a sonar. When a monocular camera is used, the acquired data are images, and three-dimensional positions cannot be acquired directly; however, three-dimensional positions can be measured by using images captured at multiple times. In this embodiment, a stereo camera is used as the sensor 10. In addition to three-dimensional information, the stereo camera can detect from its images information such as lanes and stop lines that is necessary for driving assistance. However, the sensor 10 is not limited to a stereo camera, and may be another sensor or a combination of multiple sensors such as a monocular camera and a LiDAR. Further, as the sensor 10, in addition to a sensor that measures the environment around the vehicle, a sensor that measures the state of the vehicle may be used. For example, a GNSS receiver, a compass, or a gyroscope that can measure the position and orientation of the vehicle may be used, as may a sensor that acquires information such as the position and orientation of the vehicle by communicating with a beacon installed on the road.
[0015]
 Next, after each configuration of the driving assistance device 1 is outlined, each configuration will be described in detail.
[0016]
 The relative position estimation unit 11 estimates the relative position of the vehicle based on the measurement data of the sensor 10 in a lane in which the vehicle is traveling for the first time, and records the estimated relative position as the first past travel position 12a0 in that lane. The measurement data of the sensor 10 used when estimating the past travel position 12a0 is recorded in the recording unit 12 as the first position estimation information 12b0. Here, the relative position of the vehicle is information representing the relative position/orientation with reference to the position/orientation of the vehicle at a certain time.
[0017]
 The recording unit 12 records a plurality of past travel positions 12a, position estimation information 12b related to the past travel positions 12a, and vehicle/surrounding information 12c. The plurality of past travel positions 12a include the first past travel position 12a0, and the position estimation information 12b includes the first position estimation information 12b0.
[0018]
 Based on the measurement data of the sensor 10 and the position estimation information 12b recorded in the recording unit 12, the position estimation unit 13 estimates the current position and orientation of the vehicle relative to the past travel positions 12a (hereinafter referred to as the "current position P"), adds the history of the estimated current position P as the latest past travel position to the past travel positions 12a of the recording unit 12, and outputs it to the deviation determination unit 14. Therefore, the position estimation unit 13 basically does not operate in a lane in which the vehicle is traveling for the first time, and operates only in a lane for which a travel history exists.
[0019]
 The deviation determination unit 14 determines deviation from the plurality of past travel positions 12a by using the plurality of past travel positions 12a recorded in the recording unit 12, the current position P of the vehicle with respect to the plurality of past travel positions 12a, and a predetermined reference value Th.
[0020]
 When the deviation determination unit 14 determines that the vehicle has deviated, the control unit 15 issues an alarm to the driver and controls the steering system and the acceleration/deceleration system of the vehicle so as to eliminate the deviation.
[0021]
 (Operation of relative position estimating section 11)
 Next, the process of estimating the relative position of the vehicle by the relative position estimation unit 11 will be described in detail. As described above, in a lane in which the vehicle is traveling for the first time, the relative position estimation unit 11 estimates the relative position of the vehicle based on the measurement data of the sensor 10, records the estimated relative position in the recording unit 12 as the first past travel position 12a0, and records the measurement data of the sensor 10 used for the estimation in the recording unit 12 as the first position estimation information 12b0.
[0022]
 In this way, the relative position estimation unit 11 operates when no past travel position 12a corresponding to the current environment is included in the recording unit 12, that is, when the vehicle travels in the current environment for the first time. The relative position/orientation estimated by the relative position estimation unit 11 becomes the first past travel position 12a0 for that environment (lane).
[0023]
 For example, when a monocular camera or a stereo camera is used as the sensor 10, the relative position estimation unit 11 extracts feature points and their image feature amounts from the images and associates feature points between a plurality of images by using the image feature amounts. For this, the SfM (Structure from Motion) method and the VSLAM (Visual Simultaneous Localization and Mapping) method, which estimate the relative position/orientation of the camera and the three-dimensional positions of the feature points, can be used.
[0024]
 FIG. 2 is a diagram showing an example of position estimation information 12b recorded in the recording unit 12 when the sensor 10 is a camera. As shown here, the position estimation information 12b is information defined by a combination of the "three-dimensional position" and the "image feature amount" of the feature points estimated by the SfM method or the VSLAM method. The three-dimensional positions of the feature points are in the same coordinate system as the past travel positions estimated by the SfM method and the VSLAM method.
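The specification does not prescribe any concrete data layout for the position estimation information 12b. Purely as an illustration, the FIG. 2 combination of a three-dimensional position and an image feature amount could be held as simple records, with a brute-force descriptor match standing in for the feature association later performed by the position estimation unit. The `FeaturePoint` type, the toy three-element descriptors, and all numbers below are invented for this sketch:

```python
from dataclasses import dataclass
from math import dist

@dataclass
class FeaturePoint:
    position_3d: tuple  # (x, y, z) in the same coordinate system as the past travel positions
    descriptor: tuple   # image feature amount (a compact descriptor vector, hypothetical)

# Hypothetical recording-unit content for one lane (cf. FIG. 2)
position_estimation_info = [
    FeaturePoint((12.1, -0.8, 1.5), (0.11, 0.52, 0.40)),
    FeaturePoint((14.7,  2.3, 0.9), (0.90, 0.13, 0.35)),
]

def match(descriptor, records):
    """Brute-force nearest-neighbour association by descriptor distance."""
    return min(records, key=lambda r: dist(descriptor, r.descriptor))

# A freshly extracted feature is paired with the stored 3-D position whose
# descriptor is closest, giving a 2-D/3-D correspondence for pose estimation.
best = match((0.12, 0.50, 0.41), position_estimation_info)
print(best.position_3d)
```

A real system would of course store thousands of such records per lane and use an approximate nearest-neighbour index rather than a linear scan.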
[0025]
 On the other hand, when the sensor 10 is a stereo camera or a LiDAR, the ICP-SLAM (Iterative Closest Point - Simultaneous Localization and Mapping) method, which estimates the position and orientation of the sensor by associating the three-dimensional positions output from the sensor 10 at multiple times, can be used. In this case, the "image feature amount" exemplified in FIG. 2 is not essential, so the position estimation information 12b may be defined only by the "three-dimensional position".
[0026]
 (Operation of position estimating section 13)
 Next, details of the vehicle position estimation processing by the position estimation unit 13 will be described. As described above, the position estimation unit 13 estimates the current position P of the vehicle relative to the past travel positions 12a based on the measurement data of the sensor 10 and the position estimation information 12b of the recording unit 12, adds the history of the current position P to the past travel positions 12a of the recording unit 12 as the latest past travel position, and outputs it to the deviation determination unit 14.
[0027]
 For example, when a monocular camera or a stereo camera is used as the sensor 10, the position estimation unit 13 can estimate the position and orientation from correspondences between two-dimensional and three-dimensional positions of feature points. In this case, first, feature points and their image feature amounts are extracted from the image currently captured by the camera and associated with the image feature amounts included in the position estimation information 12b, yielding a plurality of correspondences between two-dimensional positions of feature points in the current image and three-dimensional positions included in the position estimation information 12b. Next, the three-dimensional position/orientation of the camera is estimated by a known method for solving the PnP (Perspective-n-Point) problem, which estimates the position/orientation of the camera from correspondences between two-dimensional and three-dimensional positions.
[0028]
 On the other hand, when the sensor 10 is a stereo camera or LiDAR, an ICP (Iterative Closest Point) method can be used to estimate the position/orientation between the 3D point groups by associating the 3D point groups.
[0029]
 Here, since the three-dimensional positions included in the position estimation information 12b are in the same coordinate system as the past travel positions 12a, the estimated three-dimensional position/orientation of the camera is also in that coordinate system, and the current position P with respect to the past travel positions 12a can therefore be obtained.
[0030]
 (Operation of Recording Unit 12)
 Next, details of recording processing by the recording unit 12 will be described. As described above, the recording unit 12 records a plurality of past travel positions 12a, position estimation information 12b, and vehicle/surrounding information 12c.
[0031]
 The vehicle/surrounding information 12c is information obtained by grouping the past travel positions estimated by the relative position estimation unit 11 and the position estimation unit 13 according to the combination of the vehicle condition and the surrounding condition during travel. FIG. 3 shows an example: the past travel positions 12a with travel position IDs 1, 5, and so on are classified into the group defined by the vehicle condition and surrounding condition in the first row, and the past travel positions 12a with travel position IDs 2, 3, and so on are classified into the group defined by the vehicle condition and surrounding condition in the second row. The vehicle condition is, for example, the driver, the type of tires, and the number of passengers, and is estimated from user input and from the outputs of sensors installed in the suspension, seats, seat belts, and the like. The surrounding condition is, for example, the presence or absence of a preceding vehicle, the signal state, and the weather, which are estimated from the output of the sensor 10 and the like.
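As a rough Python sketch of this grouping, the vehicle/surrounding information can be treated as a lookup table from condition combinations to lists of travel-position IDs. The condition keys and ID values below are invented for illustration and do not reproduce FIG. 3's actual format:

```python
# Hypothetical vehicle/surrounding-information table (cf. FIG. 3):
# each key is a (vehicle condition, surrounding condition) combination and
# the value lists the travel-position IDs recorded under those conditions.
groups = {
    (("driver", "A"), ("weather", "clear")): [1, 5],
    (("driver", "A"), ("weather", "rain")):  [2, 3],
}

def ids_for(vehicle_condition, surrounding_condition, table):
    """Return the past-travel-position IDs matching the current conditions,
    or an empty list when no matching group has been recorded yet."""
    return table.get((vehicle_condition, surrounding_condition), [])

print(ids_for(("driver", "A"), ("weather", "rain"), groups))  # [2, 3]
```

The deviation determination described below would then restrict itself to the past travel positions whose IDs are returned by such a lookup.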
[0032]
 In addition, the recording unit 12 may also record the vehicle conditions such as the vehicle speed and the lighting condition of the direction indicator at each position of the past travel position 12a. These vehicle conditions can be acquired from CAN (Controller Area Network).
[0033]
 (Operation of Departure Determination Unit 14)
 Next, details of the deviation determination processing executed in real time by the deviation determination unit 14 will be described with reference to FIGS. 4 to 11. The deviation determination unit 14 determines, in real time, deviation from the plurality of past travel positions 12a by using the plurality of past travel positions 12a recorded in the recording unit 12, the current position P of the vehicle estimated by the position estimation unit 13, and a predetermined reference value Th.
[0034]
 First, the deviation determination unit 14 refers to the vehicle/surrounding information 12c recorded in the recording unit 12 and identifies the travel position IDs belonging to the same group as the current vehicle condition and surrounding condition. Next, it determines deviation from the selected plurality of past travel positions 12a by using the past travel positions 12a corresponding to those travel position IDs, the current position P of the vehicle estimated by the position estimation unit 13, and the predetermined reference value Th. The deviation determination processing performed by the deviation determination unit 14 is described below for each specific road environment.
[0035]
 FIG. 4 is a diagram showing an example of deviation determination on a three-lane road divided by lane markings. Here, a lane marking L1 that can be detected by the sensor 10 is indicated by a solid line, and a lane marking L1′ that cannot be detected by the sensor 10 due to blurring or the like is indicated by a broken line. FIG. 5 is a diagram showing an example of deviation determination at an intersection where a stop line L2 exists in addition to the lane markings L1.
[0036]
 The deviation determination unit 14 determines deviation based on the past travel positions 12a belonging to the same group as the current vehicle condition and surrounding conditions and on the current position P estimated by the position estimation unit 13.
[0037]
 First, the deviation determination unit 14 sets a straight line X in a direction perpendicular to the traveling direction of the vehicle, with the current position P as a reference.
[0038]
 Next, the deviation determination unit 14 fits a distribution to the coordinates of the points where the straight line X intersects each past travel position 12a. In the examples of FIGS. 4 and 5, the past travel positions 12a are concentrated in the vicinity of one of the ideal travel positions. By using, for example, a mixed normal distribution as the distribution of these past travel positions 12a, a multimodal distribution 12d having three peaks is obtained in the example of FIG. 4, and a multimodal distribution 12d having two peaks is obtained in the example of FIG. 5. Here, the value of the distribution 12d at each point on the straight line represents the existence probability of the vehicle at that point. The distribution 12d is not limited to a mixed normal distribution, however, and other probability distribution models may be used.
[0039]
 Finally, the deviation determination unit 14 compares the existence probability of the vehicle at the current position P obtained from the distribution 12d with the predetermined reference value Th, and determines deviation if the existence probability at the current position P is lower than the reference value Th. However, even if the existence probability at the current position P is lower than the reference value Th, the deviation determination unit 14 determines that the driver is intentionally performing a departure maneuver when a direction indicator or the hazard lamps are on, and does not determine deviation.
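This lateral check can be sketched in a few lines of Python under simplifying assumptions: the lateral offsets where line X crosses the past travel positions are assumed to be already grouped per lane (the patent's mixture would normally be fitted, e.g. by EM), and the offsets, lane positions, and threshold are invented numbers:

```python
from math import exp, pi, sqrt
from statistics import mean, pstdev

def mixture_pdf(x, components):
    """Evaluate a mixed normal distribution: components = [(weight, mu, sigma), ...]."""
    return sum(w * exp(-0.5 * ((x - mu) / s) ** 2) / (s * sqrt(2 * pi))
               for w, mu, s in components)

def fit_components(clustered_offsets):
    """One Gaussian per cluster of lateral offsets; cluster membership is
    assumed given here, whereas a full implementation would fit it with EM."""
    n = sum(len(c) for c in clustered_offsets)
    return [(len(c) / n, mean(c), pstdev(c) or 0.1) for c in clustered_offsets]

def is_deviating(current_offset, components, threshold, indicator_on=False):
    """Deviation when the existence probability falls below Th, unless the
    driver signals an intentional maneuver with the direction indicator."""
    if indicator_on:
        return False
    return mixture_pdf(current_offset, components) < threshold

# Three lanes' worth of past lateral offsets on line X (metres, hypothetical)
lanes = [[-3.5, -3.4, -3.6], [0.0, 0.1, -0.1], [3.5, 3.6, 3.4]]
comps = fit_components(lanes)
print(is_deviating(1.8, comps, threshold=0.05))   # between lanes -> True
print(is_deviating(0.05, comps, threshold=0.05))  # near a lane centre -> False
```

The three-peak shape of `comps` corresponds to the distribution 12d of FIG. 4; a straddling position between lane centres has near-zero density and is flagged, while a position near any peak is not.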
[0040]
 By repeating such processing while the vehicle is traveling, the deviation determination unit 14 can constantly monitor deviation from the lane.
[0041]
 FIG. 6 is a diagram showing an example of deviation determination when an exceptional past travel position 12a′, recorded while avoiding a parked vehicle, is included among the past travel positions 12a belonging to the same group as the current vehicle condition and surrounding condition. Since the number of exceptional past travel positions 12a′ is extremely small compared to the other, normal past travel positions 12a, the distribution 12d obtained by the deviation determination unit 14 has only a very small peak at the positions corresponding to the past travel positions 12a′, and its probability is small. Therefore, in deviation determination based on the distribution 12d, the deviation determination unit 14 is not greatly affected by the exceptional past travel positions 12a′ and can correctly determine deviation from the lane based on the majority of normal past travel positions 12a. In FIG. 6, the distribution 12d is obtained with the exceptional past travel positions 12a′ taken into account; alternatively, such exceptional positions may be deleted from the recording unit 12, and the distribution 12d may be obtained using only past travel positions 12a that are considered normal.
[0042]
 In addition to deviation determination based on the current position P, the deviation determination unit 14 may also perform deviation determination based on vehicle conditions. Vehicle conditions usable for this purpose are, for example, the speed and the lighting status of the direction indicators. Details of the deviation determination processing based on vehicle information by the deviation determination unit 14 are described below for each specific environment.
[0043]
 FIG. 7 is a diagram showing an example of deviation determination based on speed at an intersection. FIG. 8 is a diagram showing an example of deviation determination based on speed in front of the stop line L2. FIG. 9 is a diagram showing an example of deviation determination based on speed in a parking lot.
[0044]
 In the speed-based deviation determination illustrated in FIGS. 7 to 9, the deviation determination unit 14 first selects a set G of points on the past travel positions 12a close to the current position P. Next, the deviation determination unit 14 acquires the recorded speed V associated with each point included in the set G, fits a mixed normal distribution, and obtains the speed distribution 12dV. FIG. 10 is a diagram showing an example of the speed distribution 12dV.
 The deviation determination unit 14 compares the existence probability of the current speed v obtained from the speed distribution 12dV with the predetermined reference value Th, and determines deviation from the normal speed if the existence probability of the current speed v is lower than the reference value Th, for example when the current speed is extremely high compared to the usual speed at that location.
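The same thresholding idea applies to speed. As a minimal sketch in Python, a single normal distribution is fitted here instead of the mixed normal distribution named in the text, and the recorded speeds and threshold are invented:

```python
from statistics import NormalDist

def speed_deviates(current_speed, recorded_speeds, threshold):
    """Fit a normal distribution to the speeds recorded near the current
    position and flag the current speed when its density falls below Th.
    (The specification fits a mixture; one component suffices for a sketch.)"""
    d = NormalDist.from_samples(recorded_speeds)
    return d.pdf(current_speed) < threshold

# Hypothetical speeds (km/h) recorded while passing the same intersection
past = [18, 20, 22, 19, 21]
print(speed_deviates(45, past, threshold=0.01))  # far above normal -> True
print(speed_deviates(20, past, threshold=0.01))  # typical -> False
```

A speed of 45 km/h where the history clusters around 20 km/h lies many standard deviations out, so its density is effectively zero and the deviation is flagged.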
[0045]
 FIG. 11 is a diagram showing an example of deviation determination based on the lighting status of the direction indicators at an intersection. In this determination, the deviation determination unit 14 selects the positions PL at which a direction indicator was turned on or off along the past travel positions 12a. Here, when the direction indicator is currently on, the turn-off positions are selected as the positions PL, and when it is currently off, the turn-on positions are selected as the positions PL. Next, a straight line Y is set from the current position P in the traveling direction. Then, the plurality of positions PL are projected perpendicularly onto the straight line Y, and a mixed normal distribution is fitted to the projected positions on the straight line Y to obtain the distribution 12dL of the turn-on or turn-off positions of the direction indicators. The deviation determination unit 14 compares the existence probability of the current position P obtained from the distribution 12dL with the predetermined reference value Th, and determines deviation from the normal turn-on or turn-off position if the existence probability of the current position P is lower than the reference value Th.
[0046]
 (Operation of Control Unit 15)
 Next, the contents of the vehicle control processing by the control unit 15 will be described. When the deviation determination unit 14 determines a deviation in position, speed, direction-indicator state, or the like, the control unit 15 issues a warning to the driver or controls the vehicle so as to eliminate the deviation.
[0047]
 When issuing a warning to the driver, the control unit 15 notifies the driver of the deviation by sound, the navigation system screen, steering vibration, seat vibration, or other means. When controlling the vehicle, the control unit 15 controls the steering, brake, accelerator, direction indicators, and the like so as to eliminate the deviation. The control for resolving each type of deviation is described specifically below.
[0048]
 For example, suppose that a positional deviation has been determined as in FIGS. 4 and 5. In this case, the control unit 15 first obtains the position on the straight line X that is closest to the current position P and whose existence probability exceeds the predetermined reference value Th, or alternatively the closest position at which the existence probability takes a maximum value. Next, the control unit 15 controls the steering, brake, and accelerator so that the vehicle moves from the current position P toward the obtained position.
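The target-selection step of this corrective control can be sketched as follows. The triangular stand-in density, the 0.1 m sampling of line X, and all positions are illustrative assumptions rather than the patent's actual computation:

```python
def corrective_target(current, density, threshold, candidates):
    """Pick the candidate position on line X nearest to the current position
    whose existence probability exceeds Th; None if no candidate qualifies."""
    ok = [x for x in candidates if density(x) > threshold]
    return min(ok, key=lambda x: abs(x - current)) if ok else None

# Hypothetical density with lane centres at -3.5, 0.0 and 3.5 m
centres = [-3.5, 0.0, 3.5]
density = lambda x: max(1.0 - abs(x - c) for c in centres)  # crude triangular stand-in
candidates = [c / 10 for c in range(-50, 51)]               # sample line X every 0.1 m

# From a straddling position at 1.8 m, the nearest point whose probability
# exceeds Th lies at the edge of the adjacent lane's peak.
print(corrective_target(1.8, density, threshold=0.5, candidates=candidates))
```

The control unit would then steer the vehicle from the current position toward the returned target, as the paragraph above describes.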
[0049]
 Further, for example, suppose that a speed deviation has been determined as in FIG. 10. In this case, the control unit 15 first obtains the speed closest to the current speed v whose existence probability exceeds the predetermined reference value Th, or alternatively the closest speed at which the existence probability takes a maximum value. Next, the control unit 15 controls the brake or accelerator so that the vehicle accelerates or decelerates from the current speed v to the obtained speed.
[0050]
 Further, for example, suppose that a deviation in the lighting status of a direction indicator has been determined as in FIG. 11. In this case, the control unit 15 controls the direction indicator so that its lighting state matches the lighting state recorded at the selected turn-on or turn-off position PL.
[0051]
 (Effects)
 According to the first embodiment described above, the following effects are obtained.
[0052]
 (1) By using the position estimation information recorded in the recording unit, the position estimation unit can estimate the current position relative to the past travel positions with high accuracy even in an environment where GNSS accuracy is low. The deviation determination unit can therefore determine deviation from past travel and implement warnings and control.
[0053]
 (2) Even if the past travel positions include an unusual travel position such as avoidance of a parked vehicle, deviation can be determined by suppressing the influence of the unusual travel position.
[0054]
 (3) Departure can be determined correctly even in an environment with multiple lanes.
[0055]
 (4) A warning or control can be implemented when the vehicle condition differs from past travel.
[0056]
 (5) Warnings or control can be implemented for excessive speed at intersections (FIG. 7), failure to stop at stop lines (FIG. 8), sudden acceleration due to mistaking the accelerator for the brake pedal in parking lots (FIG. 9), and the like.
[0057]
 (6) It is possible to issue a warning or control when a direction indicator is forgotten to be turned on or turned off at an intersection or the like.
[0058]
 (7) A change in the travel position caused by a change in vehicle or surrounding conditions is not determined as deviation.
[0059]
 (8) The control unit can effectively eliminate the deviation according to the mode of the detected deviation.
[0060]
 (Modification of Embodiment 1)
 The deviation determination unit 14 determines deviation based on the distribution 12d calculated in real time with the current position P as a reference. However, the method of calculating the distribution is not limited to this. For example, a distribution calculated in advance may be recorded in the recording unit 12 and used to determine deviation. Specifically, the space corresponding to the road environment may be partitioned into a grid to form a grid map, and a value corresponding to the existence probability may be stored in each cell to represent the distribution.
[0061]
 FIG. 12 is a diagram showing an example of a distribution expressed by the grid map 12e. In this example, a value is assigned to each cell according to its distance from the past travel position 12a. The same processing is performed for each past travel position 12a, and the resulting grid maps 12e are summed to form a distribution for the plurality of past travel positions 12a.
[0062]
 In this case, the deviation determination unit 14 determines that the vehicle has deviated when the grid value corresponding to the current position P is smaller than the predetermined reference value Th. The control unit 15 then controls the steering, brake, and accelerator so that the vehicle moves, with the current position P as a reference, toward a position where the grid value increases.
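As a non-authoritative sketch of this modification, the following assumes each past travel position contributes a value that decays with distance; the Gaussian falloff, cell size, and function names are assumptions here, since the embodiment only requires some value corresponding to the existence probability in each cell:

```python
import numpy as np

def build_grid_distribution(past_positions, shape, cell, sigma=1.0):
    """Pre-compute a grid map: each cell holds a value that falls off
    with distance from each past travel position; the per-position
    maps are summed into one distribution."""
    H, W = shape
    ys, xs = np.mgrid[0:H, 0:W]
    grid = np.zeros(shape)
    for px, py in past_positions:
        d2 = (xs * cell - px) ** 2 + (ys * cell - py) ** 2
        grid += np.exp(-d2 / (2 * sigma ** 2))
    return grid

def is_deviating(grid, cell, p, th):
    """Deviation when the grid value at the current position is below th."""
    j, i = int(p[1] // cell), int(p[0] // cell)
    return grid[j, i] < th
```

Because the grid is computed offline, the online check reduces to one array lookup, which is the speed-up this modification targets.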
[0063]
 When deviation is determined using the speed, the speed distribution 12dV is calculated in advance by the same processing as in the deviation determination unit 14 and stored in each cell of the grid map 12e. Furthermore, when deviation is determined using the lighting status of the direction indicator, a value is set in each cell according to the distance from the lighting/extinguishing position of the direction indicator.
[0064]
 According to the modified example of the first embodiment described above, the following effects are obtained. That is, by calculating the distribution in advance, the calculation load of real-time processing while the vehicle is running can be reduced, and the deviation determination processing can be executed at a higher speed.
Example 2
[0065]
 Next, Embodiment 2 of the driving support system of the present invention will be described with reference to FIGS. 13 and 14. In the following description, the same reference numerals are given to the same constituent elements as in the first embodiment, and differences will be mainly described. The points that are not particularly described are the same as those in the first embodiment.
[0066]
 In addition to the driving support device 1 of the first embodiment, the driving support system 2 of the present embodiment includes an LDW device 22 (Lane Departure Warning device) that performs warning and control based on lane markings, an RDW device 23 (Road Departure Warning device) that performs warning and control based on the road edge, and an operation determination unit 21 that determines which of them is to be used.
[0067]
 (Block Configuration)
 FIG. 13 is a diagram showing the block configuration of the driving support system 2. As shown here, the driving support system 2 includes an operation determination unit 21, an LDW device 22, an RDW device 23, and the driving support device 1. The operation determination unit 21 determines which of the LDW device 22, the RDW device 23, and the driving support device 1 to operate based on the output of the sensor 10. The LDW device 22 performs deviation determination and warning/control based on lane markings. The RDW device 23 performs deviation determination and warning/control based on the road edge. Known techniques are used for the LDW device 22 and the RDW device 23.
[0068]
 (Operation of Operation Determination Unit)
 Next, the contents of processing in the operation determination unit 21 will be described with reference to FIG. 14. The operation determination unit 21 determines which of the LDW device 22, the RDW device 23, and the driving support device 1 to operate based on the output of the sensor 10.
[0069]
 First, in step S1, the operation determination unit 21 determines from the output of the sensor 10 whether lane markings have been detected. If detected, the process proceeds to step S2; if not, the process proceeds to step S3.
[0070]
 In step S2, the operation determination unit 21 sets the operating device to the LDW device 22, and warning/control is performed based on the lane markings detected by the sensor 10.
[0071]
 In step S3, the operation determination unit 21 determines from the output of the sensor 10 whether the road edge has been detected. If detected, the process proceeds to step S4; if not, the process proceeds to step S5.
[0072]
 In step S4, the operation determination unit 21 sets the operating device to the RDW device 23, and warning/control is performed based on the road edge detected by the sensor 10.
[0073]
 In step S5, the operation determination unit 21 sets the operating device to the driving support device 1, which performs warning and control in the situation where neither lane markings nor the road edge can be detected by the sensor 10.
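Steps S1 to S5 above amount to a simple priority cascade. A minimal sketch, with the device identifiers as placeholder strings:

```python
def select_operating_device(lane_markings_detected, road_edge_detected):
    """S1-S5: prefer the LDW device when lane markings are detected,
    then the RDW device when a road edge is detected, otherwise fall
    back to the past-travel-position based driving support device."""
    if lane_markings_detected:        # S1 -> S2
        return "LDW"
    if road_edge_detected:            # S3 -> S4
        return "RDW"
    return "driving_support_device"   # S5
```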
[0074]
 (Effects)
 According to the second embodiment described above, the following effects are obtained.
[0075]
 (1) Departure determination and warning/control can be performed appropriately according to the environment.
[0076]
 (2) In an environment where lane markings and roadsides are detected, the past travel position 12a and the position estimation information 12b are not recorded in the recording unit 12, so the usage of the storage area can be reduced.
[0077]
 (Modification of Embodiment 2)
 In the operation determination unit 21 described above, the operating device is set based on the results of detecting lane markings and road edges from the output of the sensor 10. However, the method of setting the operating device is not limited to this.
[0078]
 For example, a map and a sensor such as GNSS for estimating the position of the vehicle on the map may be additionally provided, and the operation determination unit 21 may set the operating device based on the position on the map. Specifically, the driving support device 1, which performs warning and control based on past travel positions, is set as the operating device for locations where it is known in advance that warning and control based on lane markings and road edges are difficult, such as inside intersections, where lane markings and road edges are hard to detect, and roads where no lane markings exist. In addition, the LDW device 22 is set as the operating device for locations where warning/control based on lane markings is highly likely to operate, such as highways and major national roads.
[0079]
 According to the modified example of the second embodiment described above, the following effects are obtained. That is, an operating device suitable for each environment can be set without being affected by the time required to detect lane markings and road edges from the sensor output or by erroneous detection.
Example 3
[0080]
 Next, with reference to FIGS. 15 to 17, a driving support system according to a third embodiment of the invention will be described. In the following description, the same reference numerals are given to the same constituent elements as in the first embodiment, and differences will be mainly described. The points that are not particularly described are the same as those in the first embodiment.
[0081]
 A driving support system 3 of the present embodiment is the driving support system 1 of the first embodiment with the addition of a user input unit 31 that receives valid/invalid information 12f, input by the user, for the deviation determination result.
[0082]
 (Block Configuration)
 FIG. 15 is a diagram showing the block configuration of the driving support system 3. As shown here, the driving support system 3 includes a relative position estimation unit 11, a recording unit 12, a position estimation unit 13, a deviation determination unit 14, a control unit 15, and a user input unit 31.
[0083]
 The user input unit 31 receives validity/invalidity information 12f for the departure determination result input by the user. Then, the recording unit 12 of this embodiment records the valid/invalid information 12f received by the user input unit 31 in addition to the information equivalent to that of the first embodiment. The deviation determination unit 14 of this embodiment determines deviation in consideration of the validity/invalidity information 12f.
[0084]
 (Operation of User Input Unit)
 Next, the contents of processing in the user input unit 31 will be described with reference to FIG. The user input unit 31 accepts valid/invalid information 12f for the deviation determination result from the user's input.
[0085]
 FIG. 16 is a diagram showing an example of a screen 31a presented by the user input unit 31. The user input unit 31 uses the screen of the navigation system or the like to present to the user the date and time at which the deviation determination unit 14 determined that the vehicle had deviated. The location may also be displayed using a map of the navigation system. By touching the screen or operating buttons, the user inputs that the determination is valid if it is correct, or invalid if it is incorrect. As a result, locations where the deviation determination unit 14 made an erroneous deviation determination are identified.
[0086]
 (Operation of Deviation Determination Unit)
 Next, details of the deviation determination processing in the deviation determination unit 14 of this embodiment will be described with reference to FIG. 17. The deviation determination unit 14 of the present embodiment determines, based on the valid/invalid information 12f, whether or not to perform deviation determination before executing the determination processing equivalent to that of the first embodiment.
[0087]
 FIG. 17 is a diagram showing an example of a situation in which it is determined, based on the valid/invalid information 12f, not to perform deviation determination. In this example, deviation was determined at the position 31b in the past, but the user has input that the deviation determination was invalid. When the current position P is included in a range 31c within a certain distance of the position 31b, deviation determination is not performed. On the other hand, when the current position P is not included in the range 31c, the processing of the deviation determination unit 14 is performed.
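A minimal sketch of this gating, assuming the invalidated deviation positions and the range 31c are expressed as 2-D points and a radius (the representation and function name are assumptions for illustration):

```python
import math

def should_run_deviation_check(p, invalidated_positions, radius):
    """Skip the deviation determination when the current position p lies
    within `radius` of any past position whose deviation result the
    user marked as invalid (the range 31c around position 31b)."""
    return all(math.dist(p, q) > radius for q in invalidated_positions)
```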
[0088]
 (Effects)
 According to the third embodiment described above, the following effects can be obtained. That is, since no deviation determination is performed within a certain distance from the invalidated past deviation position, erroneous deviation determination is not repeated.
Example 4
[0089]
 Next, Embodiment 4 of the driving support system of the present invention will be described with reference to FIG. 18. In the following description, the same reference numerals are given to the same constituent elements as in the first embodiment, and differences will be mainly described. The points that are not particularly described are the same as those in the first embodiment.
[0090]
 A driving support device 4 of this embodiment adds, to the configuration of the driving support device 1 of the first embodiment, a data transmission/reception unit 4a for transmitting and receiving data. By forming a system of a plurality of such devices together with a data sharing device, information acquired by the own vehicle can be shared with other vehicles, and the own vehicle can perform deviation determination based on information acquired by other vehicles.
[0091]
 (Block Configuration)
 FIG. 18 is a diagram showing a block configuration of the driving support device 4 of this embodiment and the data sharing device 40 that mediates between the driving support devices.
[0092]
 As shown here, the driving assistance device 4 includes the driving assistance device 1 of the first embodiment and a data transmitter/receiver 4a.
[0093]
 The data sharing device 40 also includes a data transmission/reception section 40a and a shared recording section 40b. Two or more driving support devices 4 are connected to the data sharing device 40 via a network 41 such as a mobile phone network. The driving support devices 4 are mounted on different vehicles.
The data sharing device 40 is installed in a server, for example.
[0094]
 The data transmission/reception unit 4a of the driving support device 4 transmits the data recorded by the recording unit 12 of the driving support device 1 to the data sharing device 40 via the network 41. The data received from the data sharing device 40 is recorded in the recording unit 12 of the driving support device 1.
[0095]
 The data transmission/reception unit 40a of the data sharing device 40 outputs the data received from the driving support devices 4 to the shared recording unit 40b. Also, the data recorded by the shared recording unit 40b is transmitted to the driving support devices 4.
[0096]
 (Operation of shared recording unit)
 Next, the details of processing in the shared recording unit 40b will be described. The shared recording unit 40b integrates and records the recorded data received from the plurality of driving support devices 4.
[0097]
 First, like the recording unit 12, the shared recording unit 40b groups past travel positions based on the vehicle/surrounding information 12c. Here, in the shared recording unit 40b, in addition to the vehicle status used by the recording unit 12, the vehicle type may be used as the vehicle status. Also, instead of identifying the driver personally, attributes of the driver, such as the age of the driver, may be used. Subsequent processing is performed for each group.
[0098]
 Next, the shared recording unit 40b unifies the coordinate systems of the recorded data received from the plurality of driving support devices 4. Since the data stored in each driving support device 4 is recorded in its own coordinate system, the coordinate systems must be unified in order to integrate the plural sets of recorded data. Matching of the three-dimensional positions included in the position estimation information 12b of each set of recorded data can be used to unify the coordinate systems. For example, the ICP (Iterative Closest Point) method, which estimates the position and orientation between three-dimensional point groups by associating their points, can be used.
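As one illustration of such coordinate-system unification, a minimal 2-D ICP loop (nearest-neighbour matching followed by a Kabsch rigid-transform solve) might look as follows. This is a simplified sketch under assumed conditions, not the actual processing of the shared recording unit 40b, which would operate on three-dimensional point groups:

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal ICP sketch: repeatedly match each source point to its
    nearest destination point, then solve the best rigid transform
    (Kabsch) for the matched pairs. Returns (R, t) mapping src to dst."""
    src = np.asarray(src, float).copy()
    dst = np.asarray(dst, float)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force)
        d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # Kabsch: optimal rotation between the centred point sets
        mu_s, mu_d = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_d - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

With a small misalignment the nearest-neighbour matching recovers the true correspondences quickly, so the loop converges to the rigid transform that unifies the two coordinate systems.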
[0099]
 Next, the shared recording unit 40b selects, from the large number of past travel positions in the unified coordinate system, the past travel positions to be transmitted to the driving support device 4. For example, similarly to the processing of the deviation determination unit 14, a current position P is virtually set and the distribution is fitted to all past travel positions. Then, a predetermined number of past travel positions are selected such that, when the distribution is fitted to the selected positions, its shape has the smallest difference from the distribution fitted to all past travel positions. As a result, a small number of past travel positions yields substantially the same distribution as when all past travel positions are used.
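The selection described above could, for example, be approximated greedily by adding one past travel position at a time so that a density estimate over the chosen subset stays closest to the estimate over all positions. The greedy strategy and the 1-D kernel-density estimate below are assumptions for illustration, not the disclosed procedure:

```python
import numpy as np

def select_representatives(positions, k, bandwidth=1.0):
    """Greedily pick k positions whose kernel-density estimate best
    matches (in squared error over a grid) the estimate over all
    positions."""
    positions = np.asarray(positions, float)
    grid = np.linspace(positions.min() - 3, positions.max() + 3, 200)

    def kde(pts):
        d = (grid[:, None] - pts[None, :]) / bandwidth
        return np.exp(-0.5 * d ** 2).sum(axis=1) / len(pts)

    target = kde(positions)
    chosen, remaining = [], list(range(len(positions)))
    for _ in range(k):
        errs = [np.sum((kde(positions[chosen + [i]]) - target) ** 2)
                for i in remaining]
        best = remaining[int(np.argmin(errs))]
        chosen.append(best)
        remaining.remove(best)
    return positions[chosen]
```

On two clusters of positions, the sketch keeps one representative per cluster, which preserves the overall shape of the distribution with far fewer points.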
[0100]
 (Effects)
 According to the fourth embodiment described above, the following effects are obtained.
[0101]
 (1) Departure can be determined with higher accuracy by using a large number of past travel positions acquired from a large number of vehicles. In addition, even in a place where the vehicle has never traveled in the past, deviation can be determined and warnings and controls can be implemented.
[0102]
 (2) By using a small number of past travel positions to express a distribution similar to one formed from a large amount of data, the amount of data communicated via the network and the storage capacity required of the driving support device can be reduced.
Example 5
[0103]
 Next, Embodiment 5 of the driving support system of the present invention will be described with reference to FIGS. 19 and 20. In the following description, the same reference numerals are given to the same constituent elements as in the first embodiment, and differences will be mainly described. The points that are not particularly described are the same as those in the first embodiment.
[0104]
 A driving assistance device 5 of the present embodiment is obtained by adding a trajectory planning unit 51 that plans the trajectory of the vehicle from the output of the sensor 10 to the driving assistance device 1 of the first embodiment.
[0105]
 (Block Configuration)
 FIG. 19 is a diagram showing the block configuration of the driving support device 5. As shown here, the driving support device 5 includes a relative position estimation unit 11, a recording unit 12, a position estimation unit 13, a deviation determination unit 14, a control unit 15, and a trajectory planning unit 51. The trajectory planning unit 51 plans the target trajectory on which the vehicle should travel from the output of the sensor 10. In addition to the output of the sensor 10, the trajectory planning unit 51 may plan the target trajectory using a map, a destination input by the user, and the like. The deviation determination unit 14 of this embodiment determines deviation using the trajectory planned by the trajectory planning unit 51 in addition to the determination method of the first embodiment.
[0106]
 (Operation of Deviation Determination Unit)
 Next, the details of processing in the deviation determination unit 14 of this embodiment will be described with reference to FIG. 20.
[0107]
 FIG. 20 is a diagram showing an example of deviation determination using a target trajectory. In this embodiment, the target trajectory 51a planned by the trajectory planning unit 51 is input to the deviation determination unit 14. The deviation determination unit 14 therefore takes the target trajectory 51a into account in creating the distribution 12d. Specifically, the deviation determination unit 14 first selects past travel positions 12a within a certain range from the target trajectory 51a. As a result, in the example of FIG. 20, only the past travel positions 12a traveled in the upper lane in the figure are selected, and the past travel positions 12a traveled in the lower lane are excluded. Next, using the selected past travel positions 12a and the target trajectory 51a, the distribution 12d is fitted in the same manner as in the first embodiment. Here, in fitting the distribution 12d, weights may be assigned to the past travel positions 12a and the target trajectory 51a. For example, by increasing the weight of the target trajectory 51a, its influence on the distribution 12d may be increased.
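The first step, selecting past travel positions within a certain range of the target trajectory 51a, can be sketched as follows; a brute-force point-to-point distance between sampled trajectory points and recorded positions is assumed for simplicity:

```python
import numpy as np

def filter_by_trajectory(past_positions, trajectory, max_dist):
    """Keep only the past travel positions lying within max_dist of
    some sampled point of the target trajectory (both Nx2 arrays)."""
    P = np.asarray(past_positions, float)
    T = np.asarray(trajectory, float)
    # distance from each past position to its nearest trajectory sample
    d = np.linalg.norm(P[:, None, :] - T[None, :, :], axis=2).min(axis=1)
    return P[d <= max_dist]
```

With a trajectory running along one lane, positions recorded in the other lane fall outside `max_dist` and are excluded before the distribution is fitted.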
[0108]
 Next, the deviation determination unit 14 determines deviation by the same processing as in the first embodiment based on the fitted distribution 12d.
[0109]
 The target trajectory 51a planned by the trajectory planning unit 51 may similarly be used in deviation determination based on the speed and the lighting status of the direction indicator. Specifically, the trajectory planning unit 51 plans the target speed and the target lighting status of the direction indicators along the target trajectory 51a, and the deviation determination unit 14 creates the distribution 12d using them.
[0110]
 (Effects)
 According to the fifth embodiment described above, the following effects are obtained.
[0111]
 (1) The number of past travel positions used for fitting the distribution is reduced, and the deviation determination process is sped up. Also, a movement that differs from the target trajectory can be determined as deviation. Furthermore, if the past travel positions include an unusual travel position, such as one recorded when avoiding a parked vehicle, deviation can be determined with the unusual travel position excluded.
[0112]
 (2) By using the past traveling position in addition to the target trajectory, deviation from the target trajectory according to the position and vehicle/periphery information can be taken into account to determine deviation.
Example 6
[0113]
 Next, with reference to FIGS. 21 and 22, a sixth embodiment of the driving support system of the present invention will be described. In the following description, the same reference numerals are given to the same constituent elements as in the first embodiment, and differences will be mainly described. The points that are not particularly described are the same as those in the first embodiment.
[0114]
 The driving support device 6 of the present embodiment adds, to the driving support device 1 of the first embodiment, a similar position estimation unit 61 that uses the output of the sensor 10, the output of the relative position estimation unit 11, and the plurality of past travel positions 12a and position estimation information 12b recorded in the recording unit 12 to replace the relative position of the vehicle in an environment traveled for the first time with a position relative to a similar past travel position 12a.
[0115]
 (Block Configuration)
 FIG. 21 is a diagram showing the block configuration of the driving support device 6. As shown here, the driving support device 6 includes a relative position estimation unit 11, a recording unit 12, a position estimation unit 13, a deviation determination unit 14, a control unit 15, and a similar position estimation unit 61.
[0116]
 (Operation of Similar Position Estimation Unit 61)
 Next, details of the processing by the similar position estimation unit 61 will be described. The similar position estimation unit 61 uses the output of the sensor 10, the output of the relative position estimation unit 11, and the plurality of past travel positions 12a and position estimation information 12b recorded in the recording unit 12 to replace the relative position of the vehicle in an environment traveled for the first time with a position relative to a similar past travel position 12a.
[0117]
 FIG. 22 is a diagram showing an example of processing by the similar position estimation unit 61, with the environment E1 around the current position P on the left and the environment E2 around the past travel position 12a recorded in the recording unit 12 on the right. Comparing the two environments, they differ in that the vehicle cannot proceed straight from the current position in environment E1 and cannot turn left from the current position in environment E2, but they are similar in that the vehicle must stop at the stop line L2.
[0118]
 In this way, when the past travel position 12a obtained while traveling in a past environment E2 similar to the current environment E1 is recorded in the recording unit 12, the similar position estimation unit 61 replaces the current position P of the vehicle in the current environment E1 with a position P' in the past environment E2. Specifically, the similar position estimation unit 61 first detects similar objects that exist in both the current environment E1 and the past environment E2. Examples of similar objects include road markings such as the lane marking L1 and the stop line L2 detected from the output of the sensor 10, three-dimensional objects such as the building O1 and the tree O2, and objects obtained from the relative position estimation unit 11 and the position estimation information 12b. In the example of FIG. 22, the stop line L2 is detected as the similar object. Next, the similar position estimation unit 61 adds the relative position of the vehicle with respect to the similar object in the current environment E1 to the position of the similar object in the past environment E2, thereby calculating the position P' in the past environment E2. Finally, the similar position estimation unit 61 outputs the position P' with respect to the past travel position to the deviation determination unit 14. As a result, the deviation determination unit 14 can perform deviation determination as if the vehicle were traveling at the position P' in the past environment E2.
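In the simplest translational case, the substitution performed by the similar position estimation unit 61 reduces to re-basing the vehicle position onto the shared landmark. The following sketch ignores orientation, which a full implementation would also transfer:

```python
import numpy as np

def transfer_position(p_current, landmark_current, landmark_past):
    """Express the current position relative to a shared landmark
    (e.g. a stop line) in the current environment, then re-base that
    offset onto the same landmark in the recorded past environment."""
    offset = np.asarray(p_current, float) - np.asarray(landmark_current, float)
    return np.asarray(landmark_past, float) + offset
```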
[0119]
 (Effects)
 According to the sixth embodiment described above, the following effects are obtained. That is, even in a place where the own vehicle has never traveled in the past, it is possible to determine the deviation using past travel positions in a similar environment, and to implement warning and control.
[0120]
 In addition, the present invention is not limited to the above-described embodiments, and includes various modifications. For example, the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the described configurations. Other aspects conceivable within the scope of the technical idea of the present invention are also included in the scope of the present invention. In addition, it is possible to replace part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment. Moreover, it is possible to add, delete, or replace a part of the configuration of each embodiment with another configuration. Further, each of the above configurations, functions, processing units, processing means, and the like may be realized by hardware, for example, by designing a part or all of them using an integrated circuit. Moreover, each of the above configurations, functions, etc. may be realized by software by a processor interpreting and executing a program for realizing each function. Information such as programs, tables, and files that implement each function can be stored in recording devices such as memories, hard disks, SSDs (Solid State Drives), or recording media such as IC cards, SD cards, and DVDs.
Code explanation
[0121]
Reference Signs List
1, 3 to 6 Driving support device
2 Driving support system
10 Sensor
11 Relative position estimation unit
12 Recording unit
12a, 12a' Past travel position
12b Position estimation information
12c Vehicle/surrounding information
12d, 12dV, 12dL Distribution
12e Grid map
12f Valid/invalid information
13 Position estimation unit
14 Deviation determination unit
15 Control unit
21 Operation determination unit
22 LDW device
23 RDW device
31 User input unit
4a Data transmission/reception unit
40 Data sharing device
40a Data transmission/reception unit
40b Shared recording unit
51 Trajectory planning unit
61 Similar position estimation unit
Claims
[Claim 1]
 A driving support device comprising:
 a recording unit that records a plurality of past travel positions of a vehicle and position estimation information related to the past travel positions;
 a relative position estimation unit that estimates a relative position of the vehicle from an output of a sensor that detects the surroundings of the vehicle, adds a history of the relative position to the past travel positions, and adds the output of the sensor used for estimating the relative position to the position estimation information;
 a position estimation unit that estimates a position of the vehicle with respect to the past travel positions from the output of the sensor and the position estimation information in the recording unit, and adds a history of the position to the past travel positions;
 a deviation determination unit that determines deviation from the plurality of past travel positions using a distribution of the plurality of past travel positions, the position of the vehicle with respect to the plurality of past travel positions, and a predetermined reference value; and
 a control unit that, when the deviation is determined, issues a warning or controls the vehicle so as to eliminate the deviation.
[Claim 2]
 The driving support device according to claim 1, wherein the deviation determination unit uses a multimodal distribution as the distribution.
[Claim 3]
 The driving support device according to claim 1, wherein the recording unit further records a vehicle condition of the vehicle, and the deviation determination unit determines deviation using the vehicle condition recorded in the recording unit and the vehicle condition obtained during actual driving.
[Claim 4]
 The driving support device according to claim 3, wherein the deviation determination unit determines deviation of the speed of the vehicle using the speed, which is one type of the vehicle condition, and the speed obtained during actual driving.
[Claim 5]
 The driving support device according to claim 3, wherein the deviation determination unit determines deviation of the lighting status of the direction indicators using the lighting status of the direction indicators, which is one type of the vehicle condition, and the lighting status of the direction indicators obtained during actual driving.
[Claim 6]
 The driving support device according to claim 1, wherein the recording unit further records vehicle/surrounding information including at least one of the driver of the vehicle, tires, number of passengers, presence or absence of a preceding vehicle, traffic signal lighting status, and weather, and the deviation determination unit compares the vehicle/surrounding information in the recording unit with the vehicle/surrounding information obtained during actual driving, selects the plurality of past travel positions recorded under the same conditions, and determines deviation based on the selected plurality of past travel positions.
[Claim 7]
 The driving support device according to claim 1, wherein the recording unit records a distribution of the plurality of past travel positions calculated in advance, and the deviation determination unit determines deviation from the plurality of past travel positions using the distribution in the recording unit, the position of the vehicle with respect to the plurality of past travel positions, and a predetermined reference value.
[Claim 8]
 A driving support system comprising: the driving support device according to claim 1; an LDW device that determines deviation based on lane markings and warns or controls the vehicle; an RDW device that determines deviation based on the road edge and warns or controls the vehicle; and an operation determination unit that determines which of the LDW device, the RDW device, and the driving support device is to be operated.
[Claim 9]
 The driving assistance system according to claim 8,
 wherein the operation determination unit operates the driving assistance device when neither lane markings nor road edges are detected from the output of the sensor.
[Claim 10]
 The driving support system according to claim 8,
 further comprising a sensor for estimating the position of the vehicle on a map,
 wherein the operation determination unit operates the driving support device based on the estimated position of the vehicle on the map.
[Claim 11]
 The driving support device according to claim 1,
 further comprising a user input unit that receives information from a user indicating whether a position determined to be a deviation is valid or invalid,
 wherein the deviation determination unit does not perform the deviation determination when the position of the vehicle falls within a certain distance of such a determined position.
[Claim 12]
 A driving support system in which the driving support device according to claim 1 and a data sharing device are connected by a network,
 wherein the data sharing device comprises a shared recording unit that integrates data received from a plurality of the driving support devices, records the integrated data as shared recording data, and provides the shared recording data to each of the driving support devices.
[Claim 13]
 The driving support system according to claim 12,
 wherein the shared recording unit selects past travel positions such that the difference in shape between a distribution applied to all past travel positions and a distribution applied to the selected past travel positions is minimized, and provides the selected past travel positions to each of the driving support devices.
[Claim 14]
 The driving support device according to claim 1,
 further comprising a trajectory planning unit that plans a target trajectory of the vehicle from the output of the sensor,
 wherein the deviation determination unit determines deviation using a distribution of the past travel positions within a certain range from the target trajectory.
[Claim 15]
 The driving support system according to claim 14,
 wherein the deviation determination unit weights and applies the distribution to the past travel positions and the target trajectory.
[Claim 16]
 The driving support device according to claim 1,
 further comprising a similar position estimation unit that determines a current position from the output of the sensor, the output of the relative position estimation unit, the plurality of past travel positions recorded in the recording unit, and the position estimation information, and that replaces the relative position in the environment with a similar position relative to the past travel positions.
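
The deviation determination recited in claims 1 and 7 — comparing the vehicle's position against a distribution of past travel positions and a predetermined reference value — can be pictured with a simple statistical test. The sketch below is purely illustrative and is not the claimed implementation: the function name, the reduction of travel positions to scalar lateral offsets, and the choice of a standard-deviation threshold as the "reference value" are all assumptions.

```python
import math

def is_deviating(past_offsets, current_offset, k=3.0):
    """Illustrative deviation test: fit a normal distribution to past
    lateral offsets (one per recorded traversal) and flag the current
    offset when it lies more than k standard deviations from the mean.
    Here k plays the role of the 'predetermined reference value'."""
    n = len(past_offsets)
    mean = sum(past_offsets) / n
    variance = sum((x - mean) ** 2 for x in past_offsets) / n
    std = math.sqrt(variance)
    if std == 0.0:
        # Degenerate distribution: any departure from the mean counts.
        return current_offset != mean
    return abs(current_offset - mean) / std > k

# Example: offsets clustered near the lane center; a 2 m excursion
# is flagged, a 3 cm wobble is not.
history = [0.1, -0.05, 0.0, 0.08, -0.02]
print(is_deviating(history, 2.0))   # True
print(is_deviating(history, 0.03))  # False
```

Precomputing the mean and variance offline, as claim 7's recording unit does with its stored distribution, avoids re-fitting on every cycle.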

Documents

Application Documents

# Name Date
1 202217032579.pdf 2022-06-07
2 202217032579-COMPLETE SPECIFICATION [07-06-2022(online)].pdf 2022-06-07
3 202217032579-DRAWINGS [07-06-2022(online)].pdf 2022-06-07
4 202217032579-DECLARATION OF INVENTORSHIP (FORM 5) [07-06-2022(online)].pdf 2022-06-07
5 202217032579-FORM 1 [07-06-2022(online)].pdf 2022-06-07
6 202217032579-FORM 18 [07-06-2022(online)].pdf 2022-06-07
7 202217032579-NOTIFICATION OF INT. APPLN. NO. & FILING DATE (PCT-RO-105-PCT Pamphlet) [07-06-2022(online)].pdf 2022-06-07
8 202217032579-POWER OF AUTHORITY [07-06-2022(online)].pdf 2022-06-07
9 202217032579-PRIORITY DOCUMENTS [07-06-2022(online)].pdf 2022-06-07
10 202217032579-REQUEST FOR EXAMINATION (FORM-18) [07-06-2022(online)].pdf 2022-06-07
11 202217032579-STATEMENT OF UNDERTAKING (FORM 3) [07-06-2022(online)].pdf 2022-06-07
12 202217032579-TRANSLATIOIN OF PRIOIRTY DOCUMENTS ETC. [07-06-2022(online)].pdf 2022-06-07
13 202217032579-Proof of Right [21-06-2022(online)].pdf 2022-06-21
14 202217032579-Verified English translation [21-06-2022(online)].pdf 2022-06-21
15 202217032579-FORM-26 [13-07-2022(online)].pdf 2022-07-13
16 202217032579-FER.pdf 2022-10-18
17 202217032579-FORM-26 [03-11-2022(online)].pdf 2022-11-03
18 202217032579-Correspondence-041122.pdf 2022-12-05
19 202217032579-GPA-041122.pdf 2022-12-05
20 202217032579-ABSTRACT [06-12-2022(online)].pdf 2022-12-06
21 202217032579-CLAIMS [06-12-2022(online)].pdf 2022-12-06
22 202217032579-COMPLETE SPECIFICATION [06-12-2022(online)].pdf 2022-12-06
23 202217032579-FER_SER_REPLY [06-12-2022(online)].pdf 2022-12-06
24 202217032579-FORM 3 [06-12-2022(online)].pdf 2022-12-06
25 202217032579-certified copy of translation [06-12-2022(online)].pdf 2022-12-06
26 202217032579-Correspondence-091122.pdf 2022-12-08
27 202217032579-GPA-091122.pdf 2022-12-08
28 202217032579-IntimationOfGrant03-12-2024.pdf 2024-12-03
29 202217032579-PatentCertificate03-12-2024.pdf 2024-12-03

Search Strategy

1 Search_StrategyE_17-10-2022.pdf
2 Search_Strategy_amendedAE_22-02-2024.pdf

ERegister / Renewals

3rd: 28 Feb 2025

From 20/11/2022 - To 20/11/2023

4th: 28 Feb 2025

From 20/11/2023 - To 20/11/2024

5th: 28 Feb 2025

From 20/11/2024 - To 20/11/2025

6th: 07 Oct 2025

From 20/11/2025 - To 20/11/2026