
Low Speed Operation Assist System For Driver Assistance

Abstract: Described herein is a low-speed operation (LSO) driver assist system (102). The LSO driver assist system (102) includes an information processing engine (410) that receives sensing data (420) from one or more sensors (104) and video cameras (106) coupled to the LSO driver assist system (102), determines whether the ego vehicle (304) is in a traffic congestion condition, and ascertains whether a current velocity of the ego vehicle (304) is different from a target velocity to be achieved by the ego vehicle (304). Further, the LSO driver assist system (102) includes a LSO implementation engine (412) that initiates the LSO functionality on the ego vehicle (304) when the current velocity of the ego vehicle is ascertained to be different from the target velocity to be achieved by the ego vehicle (304), transmits one or more signals to one of a brake assist controller and an acceleration controller, and, based on the one or more signals, actuates one of the brake assist controller and the acceleration controller so as to control the acceleration of the ego vehicle (304) during the traffic congestion condition. [[TO BE PUBLISHED WITH FIGS. 5 and 6]]


Patent Information

Application #
201811005824
Filing Date
15 February 2018
Publication Number
35/2019
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
lsdavar@ndf.vsnl.net.in
Parent Application

Applicants

MARUTI SUZUKI INDIA LIMITED
1 Nelson Mandela Road, Vasant Kunj, New Delhi-110070, India.

Inventors

1. TARUN AGGARWAL
Maruti Suzuki India Limited, Palam Gurugram Road, Gurugram, Haryana-122015, India.
2. MANDAL, SANDEEP
Maruti Suzuki India Limited, Palam Gurugram Road, Gurugram, Haryana-122015, India.
3. LALWANI, DINESH KUMAR
Maruti Suzuki India Limited, Palam Gurugram Road, Gurugram, Haryana-122015, India.
4. GOSAIN, AVNISH
Maruti Suzuki India Limited, Palam Gurugram Road, Gurugram, Haryana-122015, India.
5. GUPTA, MUDIT
Maruti Suzuki India Limited, Palam Gurugram Road, Gurugram, Haryana-122015, India.

Specification

TECHNICAL FIELD
[0001] The present disclosure relates generally to the field of automobiles. In particular, the present disclosure relates to a low-speed operation (LSO) driver assist system for assisting a driver during low-speed operation of a vehicle or during traffic jam conditions.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed subject matter, or that any publication specifically or implicitly referenced is prior art.
[0003] Due to the rapid growth of the automobile industry and rising per capita income, the number of car users and owners has been increasing every year on a very large scale. The increasing number of cars/automobiles on the road is creating severe traffic congestion, especially in large cities or metro cities. Long traffic congestions have become a prominent problem for urban development and automobile users. In traffic congestion, a vehicle moves very slowly with several stop-and-go actions, and the driver of the vehicle must be highly attentive to avoid front-rear and side collisions with surrounding vehicles. Traffic congestion adversely affects the driver's physical and psychological condition. Further, creeping through traffic makes the driver impatient and frustrated, and continuous pressing of the clutch pedal and brake pedal causes driving fatigue.
[0004] Therefore, in order to overcome the deficiencies of the existing systems, it is required to have an automated driver assist system which can assist the driver during low-speed operation or during traffic congestion condition.
OBJECTS OF THE DISCLOSURE
[0005] Some of the objects of the present disclosure, which at least one embodiment herein satisfy, are listed hereinbelow.
[0006] It is a general object of the present disclosure to provide an automated driver assist system which can assist the driver during low-speed operation or during traffic congestion condition.
[0007] It is another object of the present disclosure to provide a system and method that allows a driver of a vehicle to perform the low-speed operation with minimal driving fatigue.
[0008] It is another object of the present disclosure to provide a system and method for enabling a less experienced driver to choose to perform a low-speed operation of a vehicle during a traffic congestion condition.
[0009] It is yet another object of the present disclosure to provide a system and method for performing a more precise automated low-speed maneuver in comparison to manual low-speed maneuver.
[0010] These and other objects and advantages of the present invention will be apparent to those skilled in the art after a consideration of the following detailed description taken in conjunction with the accompanying drawings in which a preferred form of the present invention is illustrated.
SUMMARY
[0011] This summary is provided to introduce concepts related to systems and methods for operating a low-speed operation (LSO) driver assist system. The concepts are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0012] The present disclosure relates to a low-speed operation (LSO) driver assist system. The LSO driver assist system includes a non-transitory storage device
having embodied therein one or more routines operable to enable LSO functionality on an ego vehicle, and one or more processors coupled to the non-transitory storage device and operable to execute the one or more routines. The one or more routines include an information processing engine and a LSO implementation engine. The information processing engine, which when executed by the one or more processors, receives sensing data from one or more sensors and video cameras coupled to the LSO driver assist system, determines whether the ego vehicle is in a traffic congestion condition, and ascertains whether a current velocity of the ego vehicle is different from a target velocity to be achieved by the ego vehicle. Thereafter, the LSO implementation engine, which when executed by the one or more processors, initiates the LSO functionality on the Ego vehicle when the current velocity of the ego vehicle is ascertained different from the target velocity to be achieved by the ego vehicle, transmits one or more signals to one of a brake assist controller and an acceleration controller, and based on the one or more signals, actuates one of the brake assist controller and the acceleration controller so as to control the acceleration of the ego vehicle during the traffic congestion condition.
[0013] The present disclosure further relates to a method for operating a Low-Speed Operation (LSO) driver assist system. The method includes receiving, at the LSO driver assist system, sensing data from one or more sensors and video cameras coupled to the LSO driver assist system; determining whether the ego vehicle is in a traffic congestion condition; ascertaining whether a current velocity of the ego vehicle is different from a target velocity to be achieved by the ego vehicle; initiating LSO functionality on the Ego vehicle when the current velocity of the ego vehicle is ascertained different from the target velocity to be achieved by the ego vehicle; transmitting one or more signals from the LSO driver assist system to one of a brake assist controller and an acceleration controller; and based on the one or more signals, actuating one of the brake assist controller and the acceleration controller so as to control the acceleration of the ego vehicle during the traffic congestion condition.
[0014] In an aspect, the one or more signals include a current velocity, a current acceleration, a required velocity to be achieved, and a required calculated time in which the required velocity is to be achieved by the ego vehicle.
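By way of a non-limiting illustration only, the content of the one or more signals may be represented as a simple data structure. In the Python sketch below, the class name, field names, and units are assumptions made for illustration and do not form part of the disclosure.

```python
# Illustrative sketch only: names and units are assumptions, not part of the disclosure.
from dataclasses import dataclass

@dataclass
class LsoControlSignal:
    current_velocity_mps: float       # current velocity of the ego vehicle (m/s)
    current_acceleration_mps2: float  # current acceleration of the ego vehicle (m/s^2)
    required_velocity_mps: float      # required velocity to be achieved (m/s)
    time_to_target_s: float           # calculated time in which the required velocity is to be achieved (s)
```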
[0015] In an aspect, the LSO implementation engine is to automate the implementation of the LSO functionality on the ego vehicle based on the sensing data received from the one or more sensors and the video cameras coupled to the LSO driver assist system.
[0016] In an aspect, the one or more sensors are selected from any or a combination of a radar sensor, an ultrasonic sensor, a proximity sensor, a global positioning system (GPS) sensor, a LIDAR sensor, an ultrasound sensor, and a video sensor configured on/in the ego vehicle.
[0017] In an aspect, based on the sensing data, the information processing engine computes any or a combination of environmental attributes, relative distance/velocity between the ego vehicle and other vehicles, and attributes of the ego vehicle and/or of the other vehicles, while making a decision to trigger the implementation of the LSO functionality on the ego vehicle.
[0018] In an aspect, based on the one or more signals, the brake assist controller applies the brakes to the ego vehicle to decelerate the ego vehicle.
[0019] In an aspect, based on the one or more signals, the acceleration controller applies acceleration to the ego vehicle to accelerate the ego vehicle.
[0020] In an aspect, based on the one or more signals, the LSO implementation engine actuates an electronic power steering module to adjust the angle of a steering wheel of the ego vehicle.
[0021] In an aspect, the LSO driver assist system includes a LSO deactivation engine to deactivate the LSO functionality when there is no target vehicle in a detection zone of the ego vehicle for a specified threshold time.
[0022] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description
of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
[0023] It is to be understood that the aspects and embodiments of the disclosure described above may be used in any combination with each other. Several of the aspects and embodiments may be combined to form a further embodiment of the disclosure.
[0024] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0025] The illustrated embodiments of the subject matter will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and methods that are consistent with the subject matter as claimed herein, wherein:
[0026] FIG. 1 illustrates an exemplary low-speed operation (LSO) driver assist system in accordance with an exemplary implementation;
[0027] FIG. 2 illustrates a flowchart of a process of activation of the LSO driver assist system, in accordance with an embodiment of the present disclosure;
[0028] FIG. 3 illustrates different categories of distance zones between a target vehicle and an ego vehicle, in accordance with an embodiment of the present disclosure;
[0029] FIG. 4 illustrates exemplary components of the LSO driver assist system, in accordance with an exemplary embodiment of the present disclosure;
[0030] FIG. 5 illustrates an exemplary block diagram of the LSO driver assist system, in accordance with an exemplary embodiment of the present disclosure;
[0031] FIG. 6 illustrates a flowchart of a process of an exemplary implementation of the LSO system, in accordance with an embodiment of the present disclosure; and
[0032] FIG. 7 illustrates a method of operating the LSO driver assist system, in accordance with an embodiment of the present disclosure.
[0033] The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION
[0034] The detailed description of various exemplary embodiments of the disclosure is described herein with reference to the accompanying drawings. It should be noted that the embodiments are described herein in such details as to clearly communicate the disclosure. However, the amount of details provided herein is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0035] It is also to be understood that various arrangements may be devised that, although not explicitly described or shown herein, embody the principles of the present disclosure. Moreover, all statements herein reciting principles, aspects, and embodiments of the present disclosure, as well as specific examples, are intended to encompass equivalents thereof.
[0036] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes” and/or
“including,” when used herein, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.
[0037] It should also be noted that in some alternative implementations, the functions/acts noted may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed concurrently or may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
[0038] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which example embodiments belong. It will be further understood that terms, e.g., those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
[0039] Embodiments explained herein pertain to operating an advanced driver assist system of a vehicle, wherein the advanced driver assist system is capable of performing a low-speed operation during a traffic congestion condition so as to avoid driving fatigue and to maintain the psychological well-being of the driver of the vehicle.
[0040] FIG. 1 illustrates a low-speed operation (LSO) driver assist system 102 which automates the implementation of a vehicle maneuver during a traffic congestion condition, according to an embodiment of the present disclosure. The LSO driver assist system 102, also referred to as system 102 hereinafter, may be considered as any driver assist system which performs one or more assistance functions such as LDW (Lane Departure Warning), LKS (Lane Keeping Support), LCA (Lane Change Assistant), acceleration control, and braking assistance in vehicles. For implementing these assistance functions, the system 102 includes surroundings sensors 104 such as, for example, ultrasonic sensors, radar sensors, proximity sensors, global positioning system (GPS) sensors, LIDAR (Light Detection and
Ranging) sensors, and ultrasound sensors. Since the surrounding sensors are well known in the art, the detailed working or operation of these sensors is not described in the present disclosure for the sake of brevity.
[0041] Further, although the sensors 104 are shown as a part of the system 102 in FIG. 1, the sensors 104 can be disposed anywhere in the vehicle and can be coupled to the system 102 without deviating from the scope of the present disclosure. In an example, the sensors 104 may be mounted on the exterior side of a vehicle in order to sense surrounding conditions of the system 102.
[0042] Yet further, if the vehicle is equipped with a navigation system, the system 102 may also make use of data from this navigation system. Since the system 102 is generally connected to a vehicle electric system via at least one bus, preferably a controller area network (CAN) bus, the system 102 may actively intervene in the vehicle systems such as the steering system, brake system, drive train (acceleration) system, and warning systems. Drivers faced with such functions of the system 102 initially often feel overwhelmed and uneasy because they have the impression that they are disempowered by the vehicle, since many functions, actually serving the driver's safety, run without any action by the driver.
[0043] Continuing with the present embodiment, the system 102 may further include one or more video cameras 106 to capture images of the surrounding conditions of the system 102. In an aspect, the video cameras 106 may be coupled to the sensors 104 to determine the position of another vehicle (target vehicle) relative to the vehicle (ego vehicle) equipped with the system 102.
[0044] The system 102 may further include a driver assist engine(s) 108. The driver assist engine(s) 108 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the driver assist engine(s) 108. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the driver assist engine(s) 108 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the driver assist engine(s) 108 may
include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the driver assist engine(s) 108. In such examples, the system 102 may include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the system 102 and the processing resource. In other examples, driver assist engine(s) 108 may be implemented by electronic circuitry.
[0045] In one implementation, the system 102 is activated upon receipt of an instruction, at the driver assist engine(s) 108, from a driver of an EGO vehicle. The driver of the EGO vehicle may provide the instruction when the driver wants to initiate LSO functionality on the EGO vehicle, during a traffic congestion condition. In an example, the driver can provide the instruction by pressing an activation button 502 (FIG. 5) associated with the driver assist engine(s) 108.
[0046] In another implementation, during auto-mode activation, the system 102 is activated on its own in case the system 102 detects traffic congestion in the vehicle surrounding. Once the system 102 is activated, the LSO functionality is activated on the EGO vehicle.
[0047] Referring now to FIG. 2, which illustrates a flowchart of a process of activation of the system 102, at step 202, once the LSO functionality is activated, the driver assist engine(s) 108 checks whether the traffic light is RED (Step 204). In case the traffic light is detected as RED, the driver assist engine(s) 108 checks whether the system 102 is already active (Step 206) and, if so, deactivates the system 102 (Step 208). Also, while deactivating the system 102, the driver assist engine(s) 108 sends an alert to the driver of the EGO vehicle that the system 102 is deactivated due to inappropriate surrounding conditions.
[0048] However, in case the traffic light is detected as not RED, the driver assist engine(s) 108 checks whether the steering angle is greater than the threshold steering value (Step 210). In case the steering angle is determined greater than the
threshold steering value, the driver assist engine(s) 108 deactivates the system 102 and sends an alert to the driver of the EGO vehicle (Step 208).
[0049] Otherwise, in case the steering angle is determined to be smaller than the threshold steering value, the driver assist engine(s) 108 checks whether the driver interference is for less than a threshold time (Step 212). If the driver interference is for more than the threshold time during activation, the driver assist engine(s) 108 deactivates the system 102 and sends an alert to the driver of the EGO vehicle (Step 208).
[0050] And, if the driver interference is not for more than the threshold time during activation, the driver assist engine(s) 108 checks whether the distance between the EGO vehicle and the target vehicle is within a predefined range (Step 214). In case the distance between the EGO vehicle and the target vehicle is outside the predefined range, the driver assist engine(s) 108 activates the system 102 (Step 216). However, in an exemplary implementation, the driver assist engine(s) 108 monitors whether the distance between the EGO vehicle and the target vehicle remains outside the predefined range for more than a predefined time threshold. In case the distance between the EGO vehicle and the target vehicle is outside the predefined range for more than the predefined time threshold (218), the driver assist engine(s) 108 deactivates the system 102 and sends an alert to the driver of the EGO vehicle (Step 208).
[0051] Otherwise, in case the distance between the EGO vehicle and the target vehicle is within the predefined range at Step 214, the driver assist engine(s) 108 activates the system 102 and sends an alert to the driver of the EGO vehicle (Step 220).
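By way of a non-limiting illustration of the activation checks described above with reference to FIG. 2, a minimal Python sketch is given below. The data structure, threshold values, and the mapping of conditions to steps are assumptions for illustration only and are not taken from the disclosure.

```python
# Illustrative sketch of the FIG. 2 activation checks; all names and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    traffic_light_red: bool        # traffic light state in front of the ego vehicle
    steering_angle_deg: float      # current steering angle
    driver_interference_s: float   # duration of manual driver input during activation
    gap_to_target_m: float         # distance between the ego vehicle and the target vehicle
    out_of_range_s: float          # time for which the gap has been outside the predefined range

STEERING_THRESHOLD_DEG = 10.0      # assumed threshold steering value
INTERFERENCE_THRESHOLD_S = 2.0     # assumed driver-interference time limit
GAP_RANGE_M = (2.0, 30.0)          # assumed predefined distance range
OUT_OF_RANGE_TIMEOUT_S = 5.0       # assumed predefined time threshold

def lso_activation_decision(s: SensorSnapshot) -> bool:
    """Return True if the system may be activated (or stay active); False means deactivate and alert."""
    if s.traffic_light_red:                                   # Steps 204, 206, 208
        return False
    if abs(s.steering_angle_deg) > STEERING_THRESHOLD_DEG:    # Steps 210, 208
        return False
    if s.driver_interference_s > INTERFERENCE_THRESHOLD_S:    # Steps 212, 208
        return False
    lo, hi = GAP_RANGE_M
    if lo <= s.gap_to_target_m <= hi:                         # Steps 214, 220
        return True
    # Gap outside the predefined range: tolerated only for a limited time (Steps 216, 218)
    return s.out_of_range_s <= OUT_OF_RANGE_TIMEOUT_S
```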
[0052] FIG. 3 illustrates different categories of distance zones between a target vehicle 302 and an ego vehicle 304, in accordance with an embodiment of the present disclosure. As shown in FIG. 3, the system 102 or the driver assist engine(s) 108 calculates the distance between the target vehicle 302 and the ego vehicle 304 based on the inputs received from the sensors 104 and the video cameras 106. In an aspect, the system 102 or the driver assist engine(s) 108 calculates an actual
(current) distance Xt between the target vehicle 302 and the ego vehicle 304 based on the detection distance Xd and the safe distance X between the target vehicle 302 and the ego vehicle 304.
[0053] In an aspect, the system 102 or the driver assist engine(s) 108 divides the detected or determined distance into four zones, i.e., a crash zone 306, a danger zone 308, a detection zone 310, and an undetected zone 312. These zones are determined based on the distance between the target vehicle 302 and the ego vehicle 304. For example, when the target vehicle 302 and the ego vehicle 304 are at a distance defining crash zone 306 or danger zone 308, the system 102 or the driver assist engine(s) 108 does not ascertain these zones as safe zones. Beyond the distance of the crash zone 306 or danger zone 308, the detection distance Xd is the safe distance X between the target vehicle 302 and the ego vehicle 304.
[0054] Upon determination of the actual (current) distance Xt between the target vehicle 302 and the ego vehicle 304, the system 102 or the driver assist engine(s) 108 calculates the velocity Vt of the target vehicle 302. Based on the distance and the velocity Vt of the target vehicle 302, the system 102 or the driver assist engine(s) 108 calculates the required acceleration for the ego vehicle 304 and the time for the acceleration/deceleration. The system 102 or the driver assist engine(s) 108 applies the acceleration/deceleration accordingly through an engine control module (ECM) 110 and an electronic stability program (ESP) 112, and sends a feedback acceleration value and a feedback velocity value on the controller area network (CAN).
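As a non-limiting illustration of the zone classification of FIG. 3 and of how a required acceleration may be derived from the calculated distance and velocities, a short Python sketch follows. The zone boundaries and the constant-acceleration relation a = (Vtarget - Vcurrent) / t are assumptions for illustration only.

```python
# Illustrative sketch only; zone boundaries and the acceleration formula are assumptions.
CRASH_ZONE_M = 1.0        # assumed upper bound of the crash zone 306
DANGER_ZONE_M = 3.0       # assumed upper bound of the danger zone 308
DETECTION_ZONE_M = 30.0   # assumed detection distance Xd (detection zone 310)

def classify_zone(gap_m: float) -> str:
    """Map the current gap Xt to one of the four zones of FIG. 3."""
    if gap_m <= CRASH_ZONE_M:
        return "crash"
    if gap_m <= DANGER_ZONE_M:
        return "danger"
    if gap_m <= DETECTION_ZONE_M:
        return "detection"
    return "undetected"

def required_acceleration(v_ego_mps: float, v_target_mps: float, t_required_s: float) -> float:
    """Constant-acceleration request so the ego vehicle reaches the target velocity in the given time."""
    return (v_target_mps - v_ego_mps) / t_required_s

# Example: ego vehicle at 2 m/s, target vehicle at 4 m/s, converge within 2 s -> +1 m/s^2
print(classify_zone(8.0), required_acceleration(2.0, 4.0, 2.0))
```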
[0055] Apart from the ECM 110 and the ESP 112, the system further includes an electronic power steering (EPS) module 114 to maneuver the steering of the vehicle during the operation of the system 102. The working operation of the system 102 is described in detail with reference to FIGS. 4-6.
[0056] FIG. 4 illustrates the functional components of the system 102 proposed herein according to another embodiment of the present disclosure. The system 102 includes one or more processor(s) 402. The one or more processor(s) 402 may be implemented as one or more microprocessors, microcomputers, microcontrollers,
digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions.
[0057] Among other capabilities, the one or more processor(s) 402 are configured to fetch and execute computer-readable instructions stored in a memory 404 of the system 102. The memory 404 may store one or more computer-readable instructions, which may be fetched and executed to automate the implementation of the LSO on the ego vehicle 304. The memory 404 may include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0058] The system 102 also includes interface(s) 406. The interface(s) 406 may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, the sensors 104, the cameras 106, the ECM 110, the ESP 112, the EPS 114, and the like. The interface(s) 406 facilitate communication of the system 102 with various devices mounted within a vehicle. The interface(s) 406 may also provide a communication pathway for one or more components of the system 102 over CAN bus. Examples of such components include, but are not limited to, the driver assist engine(s) 108.
[0059] In an implementation, the driver assist engine(s) 108 further includes a LSO initialization engine 408, an information processing engine 410, a LSO implementation engine 412, a LSO deactivation engine 414, and other engine(s) 416. The other engine(s) 416 may include programs or coded instructions that supplement applications and functions of the system 102.
[0060] Further, in the present embodiment, the system 102 includes data 418 that is either stored or generated as a result of functionalities implemented by any of the engine(s) of the driver assist engine(s) 108. Further, the data 418 may include sensing data 420, computation data 422, and other data 424. It should be noted that although the present approach has been described in the context of a driver assist system, it may be also implemented on any other device with a programmable memory and a processor, without deviating from the scope of the present subject matter.
[0061] In operation, as shown in FIGS. 5 and 6, when a driver of the ego vehicle 304 wants to implement the LSO functionality on the ego vehicle 304 during the traffic congestion condition, the driver activates the driver assist engine(s) 108. The activation is performed by providing the instruction of initiating the LSO functionality to the LSO initialization engine 408. As described above, in an example, the driver can provide the instruction by pressing the LSO activation button 502 associated with either the driver assist engine(s) 108 or the system 102.
[0062] Upon activation, the driver assist engine(s) 108 is initialized by the LSO initialization engine 408. Thereafter, the information processing engine 410 of the driver assist engine(s) 108 retrieves the sensing data 420. In an aspect, the sensing data 420 may be understood as data sensed by the sensors 104 such as, for example, radar sensors, proximity sensors, global positioning system (GPS) sensors, LIDAR sensors, ultrasound sensors, and video sensors, and the data captured by the video cameras 106. Based on the sensing data 420, the information processing engine 410 calculates the actual (current) distance Xt between the target vehicle 302 and the ego vehicle 304. Using the calculated actual distance Xt between the target vehicle 302 and the ego vehicle 304, the information processing engine 410 calculates the velocity Vt of the target vehicle 302 and the velocity Ve of the ego vehicle 304. In case the current velocity and the target velocity to be achieved by the ego vehicle 304 are the same, the LSO functionality of the LSO driver assist system 102 would not be implemented.
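A non-limiting sketch of how the information processing engine 410 could derive the target-vehicle velocity Vt from successive distance measurements Xt and the ego velocity Ve, and decide whether an LSO intervention is needed, is given below. The finite-difference estimate and the tolerance are assumptions for illustration; the disclosure does not specify a particular computation.

```python
# Illustrative sketch only; the estimation scheme and tolerance are assumptions.
def estimate_target_velocity(x_prev_m: float, x_curr_m: float, dt_s: float, v_ego_mps: float) -> float:
    """Vt ~= Ve + d(Xt)/dt: if the gap grows, the target vehicle is faster than the ego vehicle."""
    return v_ego_mps + (x_curr_m - x_prev_m) / dt_s

def lso_intervention_needed(v_ego_mps: float, v_target_mps: float, tolerance_mps: float = 0.1) -> bool:
    """No intervention when the current and target velocities already match (within a tolerance)."""
    return abs(v_ego_mps - v_target_mps) > tolerance_mps

v_t = estimate_target_velocity(x_prev_m=10.0, x_curr_m=10.4, dt_s=0.1, v_ego_mps=3.0)  # 7.0 m/s
print(v_t, lso_intervention_needed(3.0, v_t))                                          # 7.0 True
```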
[0063] In an aspect, as shown in Step 602 of FIG. 6, in case the current velocity and the target velocity to be achieved by the ego vehicle 304 are different, the LSO implementation engine 412 implements a logic to compute any or a combination of environmental attributes, relative distance/velocity between the ego vehicle 304 and the target vehicle 302, attributes of the ego vehicle 304 and/or of the target vehicle 302, and the other conditions as mentioned above with reference to flowchart depicted in FIG. 2, in order to determine whether a safe surrounding condition exists to automate the implementation of the LSO operation. In an aspect, during determination, the LSO implementation engine 412 performs certain pre-checks to
confirm that the LSO functionality can be activated. The pre-check is done to ensure that the driver has not steered the ego vehicle 304 above a certain threshold value so that the vehicle is not moved out of the current path required by the LSO functionality.
[0064] In an aspect, after LSO functionality activation, the driver is given a certain time to leave the control to the LSO functionality. During this waiting time, the LSO functionality can be actuated, if required, by the LSO implementation engine 412. After the lapse of this waiting period, if the driver continues to interfere either by braking or accelerating manually, then the LSO deactivation engine 414 will deactivate the LSO functionality until the next manual or automatic activation of the LSO functionality is performed. However, before deactivating the LSO functionality, the driver of the ego vehicle 304 is warned about the LSO functionality deactivation.
[0065] In another aspect, in the case when no manual intervention by the driver is detected during initialization of the LSO functionality and the other pre-checks are fulfilled as required, the LSO functionality will remain active.
[0066] Once the LSO functionality is activated, the LSO implementation engine 412 determines whether the target vehicle 302 lies in one of the detection zones (FIG. 3) of the ego vehicle 304 and ascertains the calculated distance between the target vehicle 302 and the ego vehicle 304 and their relative speed. Based on the determined values, at step 604 of FIG. 6, the LSO implementation engine 412 transmits signals from the LSO driver assist engine(s) 108 to one of an acceleration controller (comprising the ECM 110), a brake assist controller (comprising the ESP 112), and, if required, an EPS controller (comprising the EPS module 114).
[0067] In an aspect as shown in FIG. 5, in case a steering angle of a steering wheel is determined outside the threshold steering value, the LSO implementation engine 412 transmits a steering input to the EPS module 114 through a steering angle input sensor 504.
[0068] In another aspect, as shown at step 604 of FIG. 6, in case there is a difference between the velocity of the target vehicle 302 and that of the ego vehicle 304, the LSO implementation engine 412 transmits, to the acceleration controller and the brake assist controller, a current velocity of the ego vehicle 304, a current acceleration of the ego vehicle 304, a target velocity for the ego vehicle 304, and a time required to reach the target velocity by the ego vehicle 304.
[0069] Based on the signals received from the LSO implementation engine 412, the brake assist controller ascertains whether braking is required (Step 606 of FIG. 6). Based on the ascertainment, the brake assist controller calculates the required deceleration and transmits a brake actuation input to the ESP 112 through a brake wheel speed sensor 506 for applying the brakes. After application of the brake, the brake assist controller sends feedback acceleration (retardation) value and feedback velocity value to the LSO implementation engine 412 (Step 608 of FIG. 6).
[0070] Similarly, based on the signals received from the LSO implementation engine 412, the acceleration controller ascertains whether acceleration is required (Step 610 of FIG. 6). Based on the ascertainment, the acceleration controller calculates the required acceleration and transmits an acceleration input to the ECM 110 through a throttle position input sensor 508 and an acceleration pedal input sensor 510 for applying the acceleration. After application of the acceleration, the acceleration controller receives acceleration value from the vehicle speed sensor 512 so as to send feedback acceleration value and feedback velocity value to the LSO implementation engine 412 (Step 612 of FIG. 6).
[0071] Accordingly, when there is a need for acceleration or retardation, the corresponding controller (brake assist controller / acceleration controller) is provided with a current velocity, a current acceleration, a required velocity to be achieved, and the required calculated time in which the required velocity should be achieved. The respective controller shall calculate the required acceleration or deceleration and actuate accordingly. Also, the controllers will provide feedback of the achieved acceleration/retardation and the velocity reached at the sampling points. The difference between target and achieved values shall decrease with
each actuation cycle till both values become equal. This handshake between the LSO implementation engine 412 and the brake assist controller / acceleration controller shall persist until required velocity is achieved or the LSO functionality is deactivated by manual or automatic intervention.
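A non-limiting Python sketch of the handshake described above is given below: on each cycle the relevant controller receives the current and required velocities, actuates, and feeds the achieved velocity back, and the loop repeats until the required velocity is reached. The simple plant model and step size are assumptions for illustration only.

```python
# Illustrative sketch of the velocity-convergence handshake; the plant model is assumed.
def controller_step(v_now: float, v_target: float, t_remaining: float, dt: float) -> float:
    """One actuation cycle: apply a = (v_target - v_now) / t_remaining for dt and return the feedback velocity."""
    a = (v_target - v_now) / max(t_remaining, dt)
    return v_now + a * dt

def lso_velocity_handshake(v_now: float, v_target: float, t_required: float, dt: float = 0.1) -> float:
    """Repeat the request/feedback exchange until the target velocity is (nearly) achieved."""
    t_remaining = t_required
    while abs(v_target - v_now) > 0.05 and t_remaining > 0:
        v_now = controller_step(v_now, v_target, t_remaining, dt)
        t_remaining -= dt
    return v_now

print(lso_velocity_handshake(v_now=2.0, v_target=5.0, t_required=3.0))  # converges to ~5.0 m/s
```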
[0072] Thus, the functionality shall remain active until it is either manually deactivated by the driver or the driver interferes with the system by manually applying the brake or accelerator. The functionality will also be deactivated if there is no target vehicle in the detection zone of the ego vehicle for a certain threshold time.
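A minimal sketch of the deactivation conditions in this paragraph is shown below; the timeout value and the flag names are assumptions for illustration only.

```python
# Illustrative deactivation check; the threshold time and input names are assumptions.
import time

NO_TARGET_TIMEOUT_S = 5.0  # assumed "certain threshold time"

class LsoDeactivationEngine:
    def __init__(self) -> None:
        self._last_target_seen = time.monotonic()

    def should_deactivate(self, manual_off: bool, driver_brake_or_accel: bool,
                          target_in_detection_zone: bool) -> bool:
        """Deactivate on manual switch-off, driver interference, or a prolonged absence of a target vehicle."""
        now = time.monotonic()
        if target_in_detection_zone:
            self._last_target_seen = now
        no_target_too_long = (now - self._last_target_seen) > NO_TARGET_TIMEOUT_S
        return manual_off or driver_brake_or_accel or no_target_too_long
```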
[0073] FIG. 7 illustrates a method 700 for operating a low-speed operation (LSO) driver assist system 102, according to an embodiment of the present disclosure. The order in which the method 700 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any appropriate order to carry out the method 700 or an alternative method. Additionally, individual blocks may be deleted from the method 700 without departing from the scope of the subject matter described herein.
[0074] At block 702, the method 700 includes receiving, at the LSO driver assist system 102, sensing data 420 from one or more sensors 104 and video cameras 106 coupled to the LSO driver assist system 102. In an aspect, the one or more sensors 104 are selected from any or a combination of a radar sensor, an ultrasonic sensor, a proximity sensor, a global positioning system (GPS) sensor, a LIDAR sensor, an ultrasound sensor, and a video sensor configured on/in the ego vehicle 304.
[0075] At block 704, the method 700 includes determining whether the ego vehicle 304 is in a traffic congestion condition.
[0076] At block 706, the method 700 includes ascertaining whether a current velocity of the ego vehicle 304 is different from a target velocity to be achieved by the ego vehicle 304.
[0077] At block 708, the method 700 includes initiating a LSO functionality on the ego vehicle 304 when the current velocity of the ego vehicle is ascertained different from the target velocity to be achieved by the ego vehicle 304.
[0078] At block 710, the method 700 includes transmitting one or more signals from the LSO driver assist system 102 to one of a brake assist controller and an acceleration controller. In an aspect, the one or more signals include a current velocity, a current acceleration, a required velocity to be achieved, and a required calculated time in which the required velocity is to be achieved by the ego vehicle.
[0079] At block 712, the method 700 includes, based on the one or more signals, actuating one of the brake assist controller and the acceleration controller so as to control the acceleration of the ego vehicle 304 during the traffic congestion condition.
[0080] In an aspect, the method 700 further includes automating the implementation of the LSO functionality on the ego vehicle 304 based on the sensing data 420 received from the one or more sensors 104 and the video cameras 106 coupled to the LSO driver assist system 102.
[0081] In an aspect, based on the sensing data 420, the method 700 further includes computing any or a combination of environmental attributes, relative distance/velocity between the ego vehicle and other vehicles 302, and attributes of the ego vehicle 304 and/or of the other vehicles 302, while making a decision to trigger the implementation of the LSO functionality on the ego vehicle 304.
[0082] In an aspect, based on the one or more signals, the method 700 further includes applying brakes, by the brake assist controller, to the ego vehicle 304 to decelerate the ego vehicle 304.
[0083] In an aspect, based on the one or more signals, the method 700 further includes applying acceleration, by the acceleration controller, to the ego vehicle 304 to accelerate the ego vehicle 304.
[0084] In an aspect, based on the one or more signals, the method 700 further includes actuating an electronic power steering module 114 to adjust the angle of a steering wheel of the ego vehicle 304.
[0085] In an aspect, the method 700 further includes deactivating the LSO functionality when there is no target vehicle in a detection zone of the ego vehicle (304) for a specified threshold time.
[0086] Thus, with the implementation of the method 700 of the present subject matter, the driver of the ego vehicle 304 having the LSO driver assist system would be able to maneuver the ego vehicle through a traffic congestion area with ease and without any fatigue or mental strain.
[0087] The above description does not provide specific details of the manufacture or design of the various components. Those of skill in the art are familiar with such details, and unless departures from those techniques are set out, known techniques, related art, or later-developed designs and materials should be employed. Those in the art can choose suitable manufacturing and design details.
[0088] It should be understood, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, as apparent from the discussion herein, it is appreciated that throughout the description, discussions utilizing terms such as “receiving,” or “determining,” or “ascertaining,” or “initiating,” or “transmitting,” or the like, refer to the action and processes of an electronic control unit, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the control unit’s registers and memories into other data similarly represented as physical quantities within the control unit memories or registers or other such information storage, transmission or display devices.
[0089] Further, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. It will be appreciated that several of the above-disclosed and other features and
functions, or alternatives thereof, may be combined into other systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may subsequently be made by those skilled in the art without departing from the scope of the present disclosure as encompassed by the following claims.
[0090] The claims, as originally presented and as they may be amended, encompass variations, alternatives, modifications, improvements, equivalents, and substantial equivalents of the embodiments and teachings disclosed herein, including those that are presently unforeseen or unappreciated, and that, for example, may arise from applicants/patentees and others.
[0091] It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.

We claim:
1. A Low-Speed Operation (LSO) driver assist system (102) comprising:
a non-transitory storage device (404) having embodied therein one or more routines operable to enable a LSO functionality on an ego vehicle (304); and
one or more processors (402) coupled to the non-transitory storage device (404) and operable to execute the one or more routines, wherein the one or more routines include:
an information processing engine (410), which when executed by the one or more processors (402),
receives sensing data (420) from one or more sensors (104) and video cameras (106) coupled to the LSO driver assist system (102),
determines whether the ego vehicle (304) is in a traffic congestion condition, and
ascertains whether a current velocity of the ego vehicle (304) is different from a target velocity to be achieved by the ego vehicle (304); and
a LSO implementation engine (412), which when executed by the one or more processors (402),
initiates the LSO functionality on the Ego vehicle (304) when the current velocity of the ego vehicle is ascertained different from the target velocity to be achieved by the ego vehicle (304),
transmits one or more signals to one of a brake assist controller and an acceleration controller, and
based on the one or more signals, actuates one of the brake assist controller and the acceleration controller so as to control
the acceleration of the ego vehicle (304) during the traffic congestion condition.
2. The LSO driver assist system (102) as claimed in claim 1, wherein the one or more signals include a current velocity, a current acceleration, a required velocity to be achieved, and a required calculated time in which the required velocity is to be achieved by the ego vehicle (304).
3. The LSO driver assist system (102) as claimed in claim 1, wherein the LSO implementation engine (412) is to automate implementation of the LSO functionality on the ego vehicle (304) based on the sensing data (420) received from the one or more sensors (104) and the video cameras (106) coupled to the LSO driver assist system (102).
4. The LSO driver assist system (102) as claimed in claim 1, wherein the one or more sensors (104) are selected from any or a combination of a radar sensor, an ultrasonic sensor, a proximity sensor, a global positioning system (GPS) sensor, a LIDAR sensor, an ultrasound sensor, and a video sensor configured on/in the ego vehicle (304).
5. The LSO driver assist system (102) as claimed in claim 1, wherein based on the sensing data (420), the information processing engine (410) computes any or a combination of environmental attributes, relative distance/velocity between the ego vehicle and other vehicles (302), and attributes of the ego vehicle (304) and/or of the other vehicles (302), while making a decision to trigger the implementation of the LSO functionality on the ego vehicle (304).

6. The LSO driver assist system (102) as claimed in claim 1, wherein based on the one or more signals, the brake assist controller applies the brakes to the ego vehicle (304) to decelerate the ego vehicle (304).
7. The LSO driver assist system (102) as claimed in claim 1, wherein based on the one or more signals, the acceleration controller applies the acceleration to the ego vehicle (304) to accelerate the ego vehicle (304).
8. The LSO driver assist system (102) as claimed in claim 1, wherein based on the one or more signals, the LSO implementation engine (412) actuates an electronic power steering module (114) to adjust the angle of a steering wheel of the ego vehicle (304).
9. The LSO driver assist system (102) as claimed in claim 1, comprising a LSO deactivation engine (414) to deactivate the LSO functionality when there is no target vehicle in a detection zone of the ego vehicle (304) for a specified threshold time.
10. A method for operating a Low-Speed Operation (LSO) driver assist system (102), the method comprising:
receiving, at the LSO driver assist system (102), sensing data (420) from one or more sensors (104) and video cameras (106) coupled to the LSO driver assist system (102);
determining whether the ego vehicle (304) is in a traffic congestion condition;
ascertaining whether a current velocity of the ego vehicle (304) is different from a target velocity to be achieved by the ego vehicle (304);
initiating a LSO functionality on the Ego vehicle (304) when the current velocity of the ego vehicle is ascertained different from the target velocity to be achieved by the ego vehicle (304);
transmitting one or more signals from the LSO driver assist system (102) to one of a brake assist controller and an acceleration controller; and
based on the one or more signals, actuating one of the brake assist controller and the acceleration controller so as to control the acceleration of the ego vehicle (304) during the traffic congestion condition.
11. The method as claimed in claim 10, wherein one or more signals include a current velocity, a current acceleration, a required velocity to be achieved, and a required calculated time in which the required velocity is to be achieved by the ego vehicle (304).
12. The method as claimed in claim 10, comprising automating the implementation of the LSO functionality on the ego vehicle (304) based on the sensing data (420) received from the one or more sensors (104) and the video cameras (106) coupled to the LSO driver assist system (102).
13. The method as claimed in claim 10, wherein the one or more sensors (104) are selected from any or a combination of a radar sensor, an ultrasonic sensor, a proximity sensor, a global positioning system (GPS) sensor, a LIDAR sensor, an ultrasound sensor, and a video sensor configured on/in the ego vehicle (304).
14. The method as claimed in claim 10, wherein based on the sensing data (420), the method comprises computing any or a combination of environmental attributes, relative distance/velocity between the ego vehicle and other vehicles (302), and attributes of the ego vehicle (304) and/or of the other vehicles (302), while making a decision to trigger the implementation of the LSO functionality on the ego vehicle (304).
15. The method as claimed in claim 10, wherein based on the one or more signals, applying brakes, by the brake assist controller, to the ego vehicle (304) to decelerate the ego vehicle (304).
16. The method as claimed in claim 10, wherein based on the one or more signals, applying acceleration, by the acceleration controller, to the ego vehicle (304) to accelerate the ego vehicle (304).
17. The method as claimed in claim 10, wherein based on the one or more signals, actuating an electronic power steering module (114) to adjust the angle of a steering wheel of the ego vehicle (304).
18. The method as claimed in claim 10, comprising deactivating the LSO functionality when there is no target vehicle in a detection zone of the ego vehicle (304) for a specified threshold time.

Documents

Orders

Section Controller Decision Date
15 Gauri Shanker 2024-08-30
15 Gauri Shanker 2025-11-10

Application Documents

# Name Date
1 201811005824-Written submissions and relevant documents [03-04-2025(online)].pdf 2025-04-03
2 201811005824-Correspondence to notify the Controller [13-03-2025(online)].pdf 2025-03-13
3 201811005824-ReviewPetition-HearingNotice-(HearingDate-19-03-2025).pdf 2025-02-17
4 201811005824-FORM-24 [30-09-2024(online)].pdf 2024-09-30
5 201811005824-AMENDED DOCUMENTS [25-06-2024(online)].pdf 2024-06-25
6 201811005824-FORM 13 [25-06-2024(online)].pdf 2024-06-25
7 201811005824-POA [25-06-2024(online)].pdf 2024-06-25
8 201811005824-Written submissions and relevant documents [14-03-2024(online)].pdf 2024-03-14
9 201811005824-FORM-26 [27-02-2024(online)].pdf 2024-02-27
10 201811005824-Correspondence to notify the Controller [27-02-2024(online)].pdf 2024-02-27
11 201811005824-US(14)-HearingNotice-(HearingDate-01-03-2024).pdf 2024-02-05
12 201811005824-FER_SER_REPLY [02-12-2021(online)].pdf 2021-12-02
13 201811005824-FORM 3 [02-12-2021(online)].pdf 2021-12-02
14 201811005824-FORM-26 [02-12-2021(online)].pdf 2021-12-02
15 201811005824-FER.pdf 2021-10-18
16 201811005824-FORM 18 [19-02-2019(online)].pdf 2019-02-19
17 201811005824-COMPLETE SPECIFICATION [02-02-2019(online)].pdf 2019-02-02
18 201811005824-CORRESPONDENCE-OTHERS [02-02-2019(online)].pdf 2019-02-02
19 201811005824-DRAWING [02-02-2019(online)].pdf 2019-02-02
20 201811005824-ENDORSEMENT BY INVENTORS [02-02-2019(online)].pdf 2019-02-02
21 201811005824-FORM 3 [02-02-2019(online)].pdf 2019-02-02
22 201811005824-Correspondence-130318.pdf 2018-03-22
23 201811005824-OTHERS-130318.pdf 2018-03-22
24 201811005824-Power of Attorney-130318.pdf 2018-03-22
25 abstract.jpg 2018-03-09
26 201811005824-DECLARATION OF INVENTORSHIP (FORM 5) [15-02-2018(online)].pdf 2018-02-15
27 201811005824-DECLARATION OF INVENTORSHIP (FORM 5) [15-02-2018(online)]_13.pdf 2018-02-15
28 201811005824-DRAWINGS [15-02-2018(online)].pdf 2018-02-15
29 201811005824-DRAWINGS [15-02-2018(online)]_3.pdf 2018-02-15
30 201811005824-FIGURE OF ABSTRACT [15-02-2018(online)].jpg 2018-02-15
31 201811005824-FIGURE OF ABSTRACT [15-02-2018(online)]_51.jpg 2018-02-15
32 201811005824-FORM 1 [15-02-2018(online)].pdf 2018-02-15
33 201811005824-FORM 1 [15-02-2018(online)]_8.pdf 2018-02-15
34 201811005824-POWER OF AUTHORITY [15-02-2018(online)].pdf 2018-02-15
35 201811005824-POWER OF AUTHORITY [15-02-2018(online)]_9.pdf 2018-02-15
36 201811005824-PROOF OF RIGHT [15-02-2018(online)].pdf 2018-02-15
37 201811005824-PROOF OF RIGHT [15-02-2018(online)]_15.pdf 2018-02-15
38 201811005824-PROVISIONAL SPECIFICATION [15-02-2018(online)].pdf 2018-02-15
39 201811005824-STATEMENT OF UNDERTAKING (FORM 3) [15-02-2018(online)].pdf 2018-02-15
40 201811005824-STATEMENT OF UNDERTAKING (FORM 3) [15-02-2018(online)]_43.pdf 2018-02-15

Search Strategy

1 _SearchStrategy-201811005824E_22-07-2021.pdf