Abstract: ABSTRACT System and Method for Providing Riding Assistance to Rider during Vehicle Operating Condition of Vehicle The present invention provides a system (100) and a method (400) for providing riding assistance to a rider during a vehicle operating condition of a vehicle (102). The system (100) comprises a sensing unit (104) and a steering angle sensor (106) that procure information pertaining to surroundings of the vehicle (102) and steering angle information, respectively. The system (100) comprises a control unit (110) communicatively coupled to the sensing unit (104) and the steering angle sensor (106). The control unit (110) is configured to filter one or more objects corresponding to a steering orientation for obtaining one or more filtered objects and to transmit an alert signal to one or more alerting devices (116) based on a distance between the vehicle (102) and the one or more filtered objects, thereby providing riding assistance to the rider during the vehicle operating condition of the vehicle (102). Reference Figure 2
Description: FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10, Rule 13]
TITLE OF INVENTION
System and Method for Providing Riding Assistance to Rider during Vehicle Operating Condition of Vehicle
APPLICANT
TVS MOTOR COMPANY LIMITED, an Indian company, having its address at “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006.
PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
[001] The present invention relates to a system and a method for providing riding assistance to a rider of a vehicle. More specifically, the present invention relates to a system and a method for providing riding assistance to the rider during a vehicle operating condition of the vehicle.
BACKGROUND OF THE INVENTION
[002] In saddle type vehicles, a rider benefits when riding assistance is provided by the saddle type vehicle. With such assistance, the rider can comprehend the status of various components of the saddle type vehicle as well as the surrounding environment. Typically, the rider is skilled in driving the saddle type vehicle in a forward direction only.
[003] However, riding the saddle type vehicle in a reverse direction is a challenge for the rider, as reverse riding requires the rider to turn his head or body to observe the surroundings behind the saddle type vehicle, for preventing collision with objects and/or people present behind the saddle type vehicle. In doing so, the rider is unaware of the distance between the saddle type vehicle and the objects and/or the people behind it. Additionally, saddle type vehicles are generally provided with a reverse mode that enables the rider to move the saddle type vehicle in a reverse direction using a propulsion system. In such a case, the saddle type vehicle may move at a higher speed in the reverse direction. In either case, the saddle type vehicle may collide with the objects and/or the people present behind it. Such a scenario may lead to accidents, which is undesirable. Moreover, the rider is required to turn his head or body during the reverse vehicle operation of the saddle type vehicle, which is non-ergonomic and makes the rider uncomfortable, deteriorating the riding experience of the rider.
[004] Thus, there is a need for a system and a method for providing riding assistance to the rider during a reverse vehicle operation of the vehicle, which addresses at least one or more aforementioned problems.
SUMMARY OF THE INVENTION
[005] In one aspect of the invention, a system for providing riding assistance to a rider during a vehicle operating condition of a vehicle is disclosed. The system comprises a sensing unit mounted on the vehicle. The sensing unit is configured to procure information pertaining to surroundings of the vehicle. The system comprises a steering angle sensor coupled to a steering unit of the vehicle. The steering angle sensor is configured to procure steering angle information of the steering unit. The system comprises a control unit communicatively coupled to the sensing unit and the steering angle sensor. The control unit is configured to filter one or more objects corresponding to a steering orientation for obtaining one or more filtered objects. The control unit is configured to transmit an alert signal to one or more alerting devices based on a distance between the vehicle and the one or more filtered objects, thereby providing riding assistance to the rider during the vehicle operating condition of the vehicle.
[006] In an embodiment, the sensing unit comprises at least one of a radio detection and ranging (RADAR) unit and an image sensor. The RADAR unit is configured to generate radar information pertaining to the surroundings of the vehicle. The image sensor is configured to generate image information pertaining to the surroundings of the vehicle.
[007] In an embodiment, the control unit is configured to receive the information pertaining to the surroundings of the vehicle from the sensing unit and the steering angle information from the steering angle sensor. The control unit is configured to determine the one or more objects based on the information pertaining to the surroundings of the vehicle. The control unit is configured to determine the steering orientation of the vehicle based on the steering angle information.
[008] In an embodiment, the control unit is configured to determine one or more left-sided objects from the one or more objects, when the determined steering orientation is in a left direction of the vehicle. The control unit is configured to determine one or more central objects from the one or more objects, when the determined steering orientation is in a substantially central direction of the vehicle. The control unit is configured to determine one or more right-sided objects from the one or more objects, when the determined steering orientation is in a right direction of the vehicle.
[009] In an embodiment, the control unit is adapted to estimate the distance between the vehicle and the one or more filtered objects. The control unit is adapted to transmit the alert signal with a first intensity when the distance between the vehicle and the one or more filtered objects is below a first threshold value.
[010] In an embodiment, the control unit is adapted to transmit the alert signal with a second intensity when the distance between the vehicle and the one or more filtered objects is below a second threshold value. The second threshold value is below the first threshold value.
[011] In an embodiment, the one or more alerting devices are configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert when the distance between the vehicle and the one or more filtered objects is below one of the first threshold value and the second threshold value.
[012] In an embodiment, the control unit is configured to provide a collision alert to the one or more alerting devices and a collision signal to at least one of a powertrain and a brake unit of the vehicle when the distance between the vehicle and the one or more filtered objects is below a critical value, wherein the critical value is lower than the first threshold value and the second threshold value.
[013] In an embodiment, the one or more alerting devices are configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert upon receiving the collision alert. The at least one of the powertrain and the brake unit are operated to stop the vehicle upon receiving the collision signal.
[014] In an embodiment, the one or more alerting devices comprise at least one of a display device, an audible device, and a haptic device.
[015] In an embodiment, the control unit is adapted to transmit the alert signal to the one or more alerting devices based on the distance between a rear portion of the vehicle and the one or more filtered objects.
[016] In an embodiment, the vehicle operating condition corresponds to a reverse riding condition of the vehicle.
[017] In another aspect of the invention, a method for providing riding assistance to a rider during a vehicle operating condition of a vehicle is disclosed. The method comprises procuring, by a sensing unit, information pertaining to surroundings of the vehicle. The method comprises procuring, by a steering angle sensor, steering angle information of a steering unit. The method comprises filtering, by a control unit, one or more objects corresponding to a steering orientation for obtaining one or more filtered objects. The method comprises transmitting, by the control unit, an alert signal to one or more alerting devices based on a distance between the vehicle and the one or more filtered objects, thereby providing riding assistance to the rider during the vehicle operating condition of the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS
[018] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a top perspective view of a vehicle, in accordance with an embodiment of the present invention.
Figure 2 is a block diagram illustrating a system for providing riding assistance to a rider during a vehicle operating condition of the vehicle, in accordance with an embodiment of the present invention.
Figure 3 is a flow chart illustrating a method for providing riding assistance to a rider during a vehicle operating condition of a vehicle, in accordance with an embodiment of the present invention.
Figure 4 is a flow chart illustrating the method for providing riding assistance to a rider during a vehicle operating condition of a vehicle, in accordance with another embodiment of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[019] The present invention relates to a system and a method for providing riding assistance to a rider during a vehicle operating condition of a vehicle. The system in the present invention is adapted to filter one or more objects based on information pertaining to surroundings of the vehicle and a steering orientation of the vehicle. The system is further adapted to transmit an alert signal based on a distance between the vehicle and the one or more filtered objects, thereby providing riding assistance during the vehicle operating condition of the vehicle.
[020] Figure 1 is a perspective view of a vehicle 102 in accordance with an embodiment of the present invention. The vehicle 102 comprises a system 100 for providing riding assistance to a rider (not shown) during a vehicle operating condition of the vehicle 102. In an embodiment, the term “riding assistance” refers to assistance provided by the system 100 to the rider for informing the rider about surroundings around the rider.
[021] Figure 2 is a block diagram illustrating the system 100 for providing riding assistance to the rider during the vehicle operating condition of the vehicle 102, in accordance with an embodiment of the present invention. In an embodiment, the vehicle operating condition corresponds to a reverse riding condition of the vehicle 102. The system 100 comprises a sensing unit 104 and a steering angle sensor 106. The sensing unit 104 is configured to procure information pertaining to surroundings of the vehicle 102. The sensing unit 104 is mounted on the vehicle 102. In one embodiment, the sensing unit 104 is oriented to view rearwardly from the vehicle 102. The sensing unit 104 comprises at least one of a radio detection and ranging (RADAR) unit 112 and an image sensor 114. The RADAR unit 112 is configured to generate radar information pertaining to the surroundings of the vehicle 102. The image sensor 114 is configured to generate image information pertaining to the surroundings of the vehicle 102. In one embodiment, the radar information and the image information pertain to a rearward view of the vehicle 102.
[022] The steering angle sensor 106 is coupled to a steering unit (not shown) of the vehicle 102. In an embodiment, the steering unit may be a handlebar (not shown) of the vehicle 102. The steering angle sensor 106 is configured to procure steering angle information of the steering unit. The steering angle information is indicative of a direction and/or a rotation angle of the steering unit with respect to a central axis (not shown) of the steering unit. In one embodiment, the steering angle sensor 106 may be one of a potentiometer, an inertial measurement unit sensor, a gyroscope, and the like.
[023] The system 100 comprises a control unit 110 that is coupled wirelessly or by wire to the sensing unit 104 and the steering angle sensor 106. In one embodiment, the control unit 110 is coupled to the sensing unit 104 and the steering angle sensor 106 using one of a universal serial bus, a Gigabit Multimedia Serial Link, wireless fidelity (Wi-Fi), Bluetooth, and the like. The control unit 110 is configured to receive the information pertaining to the surroundings of the vehicle 102 from the sensing unit 104 and the steering angle information from the steering angle sensor 106.
[024] The system 100 further comprises an instrument cluster 108 and one or more alerting devices 116 that are communicatively coupled to the control unit 110. The system 100 also comprises a speed sensor (not shown) disposed in the vehicle 102. The speed sensor is configured to procure the speed of movement of the vehicle 102. The control unit 110 receives a speed signal indicative of the speed of movement of the vehicle 102 from the speed sensor. The control unit 110 determines reverse movement of the vehicle 102 based on the speed signal and accordingly activates the vehicle operating condition of the vehicle 102.
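For illustration only and not forming part of the specification, the reverse-movement check described above may be sketched as follows. The sketch assumes a signed speed signal in which negative values indicate reverse movement; the function name and the debounce threshold are hypothetical.

```python
def reverse_condition_active(speed_signal_kmph: float,
                             reverse_threshold_kmph: float = -0.5) -> bool:
    """Return True when the reverse riding condition should be activated.

    Assumes a signed speed signal where negative values indicate reverse
    movement; the -0.5 km/h threshold is an illustrative debounce value.
    """
    return speed_signal_kmph <= reverse_threshold_kmph


# Example: a speed signal of -1.2 km/h activates the reverse riding condition,
# whereas a forward speed of 3.0 km/h does not.
assert reverse_condition_active(-1.2) is True
assert reverse_condition_active(3.0) is False
```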
[025] The instrument cluster 108 comprises a display unit (not shown) that is coupled wirelessly or by wire to the control unit 110. In one embodiment, the instrument cluster 108 and the control unit 110 are coupled via Wi-Fi. In an embodiment, the control unit 110 is configured to transmit the radar information or the image information to the display unit for display to the rider. The one or more alerting devices 116 comprise at least one of a display device, an audible device, and a haptic device located on at least one of a rider seat, a handlebar, a foot pedal, and the like.
[026] Referring to method 300 of Figure 3 in accordance with Figure 2, when the vehicle 102 is turned ON at step 302, the control unit 110 checks whether the vehicle 102 is moving in a reverse direction or a forward direction based on the speed signal received from the speed sensor at step 304. Thereafter, upon determining that the vehicle 102 is moving in the reverse direction, the control unit 110 activates the reverse riding condition. In an embodiment, when the vehicle 102 is an electric vehicle, the electric vehicle has a transmission that is used for reverse riding and can be activated by the rider.
[027] Upon activation of the reverse operating condition, the control unit 110 is configured to operate the sensing unit 104 at step 306. The sensing unit 104 generates at least one of the radar information and the image information pertaining to the surroundings of the vehicle 102. The control unit 110 is configured to determine one or more objects based on the information pertaining to the surroundings of the vehicle 102 at step 308. The one or more objects include vehicles, people, walls, and the like.
[028] Further, the control unit 110 is configured to determine a steering orientation based on the steering angle information at step 310. In one embodiment, the steering orientation may range between 0 and 180 degrees. In another embodiment, the steering orientation may range from a left direction (step 312) to a central direction (step 314) to a right direction (step 316) of the vehicle 102. The control unit 110 is configured to filter the one or more objects corresponding to the determined steering orientation for obtaining one or more filtered objects. Filtering of the one or more objects corresponds to identifying a set of objects visible in a portion of the surroundings, wherein the portion of the surroundings is determined based on the steering orientation of the vehicle 102. In other words, filtering of the one or more objects corresponds to masking objects that are not required to be viewed by the rider during reverse riding, based on the steering orientation of the vehicle 102. The identified set of objects lie behind the vehicle 102 on the reverse path, where a collision with the vehicle 102 may occur. In an embodiment, the control unit 110 is configured to use image processing techniques, neural networks, machine learning, and the like to filter the one or more objects.
[029] In an embodiment, the control unit 110 is configured to determine one or more left-sided objects from the one or more objects, when the determined steering orientation is in the left direction of the vehicle 102. The control unit 110 is configured to determine one or more central objects from the one or more objects, when the determined steering orientation is in a substantially central direction of the vehicle 102. The control unit 110 is configured to determine one or more right-sided objects from the one or more objects, when the determined steering orientation is in the right direction of the vehicle 102.
[030] In an exemplary embodiment, if the steering orientation is towards the left side and six objects are determined by the sensing unit 104 at the rear portion of the vehicle 102, the control unit 110 is adapted to display only the objects that are on the left side of the vehicle 102 while masking the other objects. Thus, if four of the six objects are at the center and on the right side of the vehicle 102, the control unit 110 masks these four objects and displays only the two objects that are on the left side of the rear portion of the vehicle 102, thereby assisting the rider during the reverse riding operation of the vehicle 102.
[031] Similarly, in an exemplary embodiment, if the steering orientation is towards the right side and six objects are determined by the sensing unit 104 at the rear portion of the vehicle 102, the control unit 110 is adapted to display only the objects that are on the right side of the vehicle 102 while masking the other objects. Thus, if four of the six objects are at the center and on the left side of the vehicle 102, the control unit 110 masks these four objects and displays only the two objects that are on the right side of the rear portion of the vehicle 102, thereby assisting the rider during the reverse riding operation of the vehicle 102.
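The steering-orientation-based filtering and masking described in the preceding paragraphs may be illustrated by the following minimal sketch. It assumes a signed steering angle (negative towards the left, positive towards the right) and a signed lateral offset for each detected object; all names, sign conventions, and band widths are illustrative assumptions rather than part of the specification.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class DetectedObject:
    label: str
    lateral_offset_m: float  # negative = left of the vehicle centreline, positive = right
    distance_m: float        # estimated distance from the rear portion of the vehicle


def steering_sector(steering_angle_deg: float, center_band_deg: float = 10.0) -> str:
    """Map a steering angle to 'left', 'center', or 'right'.

    Assumes 0 degrees is straight, negative angles steer left and positive
    angles steer right; the 10-degree central band is illustrative.
    """
    if steering_angle_deg < -center_band_deg:
        return "left"
    if steering_angle_deg > center_band_deg:
        return "right"
    return "center"


def filter_objects(objects: List[DetectedObject],
                   steering_angle_deg: float,
                   center_band_m: float = 0.5) -> List[DetectedObject]:
    """Keep only objects lying in the sector the vehicle will reverse into;
    the remaining objects are masked (not shown to the rider)."""
    sector = steering_sector(steering_angle_deg)

    def object_sector(obj: DetectedObject) -> str:
        if obj.lateral_offset_m < -center_band_m:
            return "left"
        if obj.lateral_offset_m > center_band_m:
            return "right"
        return "center"

    return [obj for obj in objects if object_sector(obj) == sector]


# Usage mirroring the six-object example: with the steering turned left, only
# the two left-sided objects are retained and the other four are masked.
scene = [DetectedObject("wall", -1.5, 2.0), DetectedObject("person", -0.9, 1.2),
         DetectedObject("car", 0.0, 3.0), DetectedObject("pillar", 0.1, 2.5),
         DetectedObject("bin", 1.4, 1.8), DetectedObject("scooter", 2.0, 2.2)]
print([o.label for o in filter_objects(scene, steering_angle_deg=-20.0)])  # ['wall', 'person']
```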
[032] Upon determining the one or more filtered objects, the control unit 110 is configured to estimate a distance between the vehicle 102 and the one or more filtered objects at step 318. In an embodiment, the estimated distance is between a rear portion of the vehicle 102 and the one or more filtered objects. The control unit 110 is further configured to transmit an alert signal to the one or more alerting devices 116 based on the estimated distance, thereby providing riding assistance to the rider during the reverse riding operation of the vehicle 102. The term “rear portion” of the vehicle 102 refers to one of a tail lamp, a number plate, a rear wheel, and the like.
[033] The control unit 110 is adapted to transmit the alert signal with a first intensity when the distance between the vehicle 102 and the one or more filtered objects is below a first threshold value at step 320. In one embodiment, the first threshold value is 100 centimeters. In an embodiment, the alert signal is transmitted to the one or more alerting devices 116 when the distance between the vehicle 102 and the one or more filtered objects is 80 centimeters.
[034] The control unit 110 is adapted to transmit the alert signal with a second intensity when the distance between the vehicle 102 and the one or more filtered objects is below a second threshold value at step 320. The second threshold value is below the first threshold value. The second intensity is higher than the first intensity. In one embodiment, the second threshold value is 50 centimeters. In an embodiment, the alert signal is transmitted to the one or more alerting devices 116 when the distance between the vehicle 102 and the one or more filtered objects is 40 centimeters.
[035] In an embodiment, the intensity of the alert signal refers to at least one of loudness, brightness, and vibration level of the alert provided by the one or more alerting devices 116. As such, intensity of the alert provided by the one or more alerting devices 116 upon receiving the alert signal with the second intensity is greater than the intensity of the alert provided by the one or more alerting devices 116 upon receiving the alert signal with the first intensity.
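The threshold-based selection of the alert intensity described in the preceding paragraphs may be sketched as follows, using the illustrative 100-centimeter and 50-centimeter values; the function name and the returned labels are hypothetical.

```python
def select_alert_intensity(distance_cm: float,
                           first_threshold_cm: float = 100.0,
                           second_threshold_cm: float = 50.0) -> str:
    """Choose an alert intensity from the estimated rear distance.

    The 100 cm and 50 cm defaults follow the illustrative values in the
    description; 'none' means no alert signal is transmitted.
    """
    if distance_cm < second_threshold_cm:
        return "second_intensity"  # louder, brighter, or stronger vibration
    if distance_cm < first_threshold_cm:
        return "first_intensity"
    return "none"


# Example values from the description: 80 cm yields the first intensity and
# 40 cm yields the higher second intensity.
assert select_alert_intensity(80.0) == "first_intensity"
assert select_alert_intensity(40.0) == "second_intensity"
```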
[036] The one or more alerting devices 116 are configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert upon receiving the alert signal at step 324.
[037] When the estimated distance between the rear portion of the vehicle 102 and the one or more filtered objects is below a critical value, which is lower than both the first threshold value and the second threshold value, the control unit 110 is configured to provide a collision alert at step 322. The term “collision alert” refers to an alert raised for the rider of the vehicle 102 when the vehicle 102 is in proximity to an obstacle along the travel path, within a distance below the critical value, during the reverse operating condition. In one embodiment, the critical value is 30 centimeters. In an embodiment, the control unit 110 is configured to provide the collision alert to the one or more alerting devices 116 and a collision signal to at least one of a powertrain (step 326) and a brake unit (step 328) of the vehicle 102 when the distance between the vehicle 102 and the one or more filtered objects is 20 centimeters.
[038] The one or more alerting devices 116 are configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert upon receiving the collision alert at step 324. Upon receiving the collision signal, the at least one of the powertrain and the brake unit is operated to stop the vehicle 102. As a result, the rider is assisted by the system 100 in avoiding collision with the one or more filtered objects during the reverse operating condition of the vehicle 102.
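The handling of distances below the critical value may be sketched as follows, assuming the illustrative 30-centimeter critical value; the function name and the returned flags are hypothetical stand-ins for the collision alert sent to the alerting devices and the collision signal sent to the powertrain and/or the brake unit.

```python
def collision_response(distance_cm: float, critical_value_cm: float = 30.0) -> dict:
    """Decide the collision response for the estimated rear distance.

    Below the critical value, a collision alert is raised for the alerting
    devices and a collision signal is issued so that the powertrain and/or
    the brake unit can stop the vehicle.
    """
    below_critical = distance_cm < critical_value_cm
    return {
        "collision_alert": below_critical,   # visual, audible, or haptic alert
        "collision_signal": below_critical,  # request to stop the vehicle
    }


# Example from the description: at 20 cm both the alert and the stop request are issued.
assert collision_response(20.0) == {"collision_alert": True, "collision_signal": True}
assert collision_response(60.0) == {"collision_alert": False, "collision_signal": False}
```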
[039] In one embodiment, when the rider steers the vehicle 102 towards the left in the reverse operating condition, the vehicle 102 is bound to move in a left reverse direction. Accordingly, the system 100 provides reverse riding assistance for avoiding the one or more left-sided objects filtered from the one or more objects. In another embodiment, when the rider steers the vehicle 102 towards the center in the reverse operating condition, the vehicle 102 is bound to move in a central reverse direction. Accordingly, the system 100 provides reverse riding assistance for avoiding the one or more central objects filtered from the one or more objects. In yet another embodiment, when the rider steers the vehicle 102 towards the right in the reverse operating condition, the vehicle 102 is bound to move in a right reverse direction. Accordingly, the system 100 provides reverse riding assistance for avoiding the one or more right-sided objects filtered from the one or more objects.
[040] In an embodiment, the control unit 110 is embodied as a multi-core processor, a single core processor, or a combination of one or more multi-core processors and one or more single core processors. For example, the control unit 110 is embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In another embodiment, the control unit 110 is configured to execute hard-coded functionality.
[041] The control unit 110 comprises a storage unit (not shown). The storage unit of the control unit 110 may include a memory. The memory may be a main memory, a static memory, or a dynamic memory. The memory may include but is not limited to computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. The memory is operable to store instructions executable by the processor. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor executing the instructions stored in the memory.
[042] Figure 4 is a flow chart illustrating a method 400 for providing riding assistance to the rider during the vehicle operating condition of the vehicle 102, in accordance with another embodiment of the present invention.
[043] At step 402, the sensing unit 104 procures information pertaining to surroundings of the vehicle 102. At step 404, the steering angle sensor 106 procures steering angle information of a steering unit of the vehicle 102. The control unit 110 receives the information pertaining to the surroundings of the vehicle 102 from the sensing unit 104 mounted on the vehicle 102 and the steering angle information from the steering angle sensor 106 coupled to the steering unit of the vehicle 102.
[044] Thereafter, the control unit 110 determines the one or more objects based on the information pertaining to the rearward view of the vehicle 102.
[045] The control unit 110 determines the steering orientation of the vehicle 102 based on the steering angle information. At step 406, the control unit 110 filters the one or more objects corresponding to the determined steering orientation for obtaining one or more filtered objects, as already described in the description pertaining to Figures 2 and 3.
[046] At step 408, the control unit 110 transmits the alert signal to the one or more alerting devices 116 based on the distance between the vehicle 102 and the one or more filtered objects. In an embodiment, the estimated distance is between the rear portion of the vehicle 102 and the one or more filtered objects. The alert signal is transmitted with the first intensity when the distance between the vehicle 102 and the one or more filtered objects is below the first threshold value. The alert signal is transmitted with the second intensity when the distance between the vehicle 102 and the one or more filtered objects is below the second threshold value. Alternatively, if the estimated distance is below the critical value, wherein the critical value is below the first threshold value and the second threshold value, the control unit 110 provides the collision alert to the one or more alerting devices 116 and the collision signal to at least one of the powertrain and the brake unit of the vehicle 102.
[047] The one or more alerting devices 116 are adapted to alert the rider, thereby providing riding assistance during the vehicle operating condition of the vehicle 102. Subsequently, the rider is alerted by one of the visual alert, the audible alert, and the haptic alert when the distance between the vehicle 102 and the one or more filtered objects is below one of the first threshold value, the second threshold value, and the critical value. In an embodiment, the rider is alerted by one of the visual alert, the audible alert, and the haptic alert when the distance between the rear portion of the vehicle 102 and the one or more filtered objects is below one of the first threshold value, the second threshold value, and the critical value. The collision signal is indicative of operating at least one of the powertrain and the brake unit for stopping the vehicle 102.
[048] The claimed invention as disclosed above is not routine, conventional, or well understood in the art, as the claimed aspects enable the following solutions to the existing problems in conventional technologies. Specifically, the claimed aspects include determining a steering orientation of the vehicle based on the steering angle information and filtering one or more objects corresponding to the determined steering orientation for obtaining one or more filtered objects. Further, the system alerts the rider based on an estimated distance between the vehicle and the one or more filtered objects when the reverse riding operation is activated. As a result, the rider can avoid collisions based on the alert signal and the collision signal provided to the one or more alerting devices, the powertrain, and the brake unit, thereby ensuring rider safety.
[049] In light of the abovementioned advantages and the technical advancements provided by the disclosed system and method, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable the following solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself as the claimed steps provide a technical solution to a technical problem.
[050] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable storage medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[051] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
List of Reference Numerals
100 – System
102 – Vehicle
104 – Sensing unit
106 – Steering angle sensor
108 – Instrument cluster
110 – Control unit
112 – RADAR unit
114 – Image sensor
116 – One or more Alerting Devices
Claims: WE CLAIM:
1. A system (100) for providing riding assistance to a rider during a vehicle operating condition of a vehicle (102), the system (100) comprising:
a sensing unit (104), the sensing unit (104) being mounted on the vehicle (102), the sensing unit (104) being configured to procure information pertaining to surroundings of the vehicle (102);
a steering angle sensor (106) coupled to a steering unit of the vehicle (102), the steering angle sensor (106) being configured to procure steering angle information of the steering unit; and
a control unit (110) communicatively coupled to the sensing unit (104) and the steering angle sensor (106), the control unit (110) being configured to:
filter, one or more objects corresponding to a steering orientation for obtaining one or more filtered objects; and
transmit, an alert signal to one or more alerting devices (116) based on a distance between the vehicle (102) and the one or more filtered objects, thereby providing riding assistance to the rider during the vehicle operating condition of the vehicle (102).
2. The system (100) as claimed in claim 1, wherein the sensing unit (104) comprises at least one of a radio detection and ranging (RADAR) unit (112) and an image sensor (114),
the RADAR unit (112) being configured to generate radar information pertaining to the surroundings of the vehicle (102); and
the image sensor (114) being configured to generate image information pertaining to the surroundings of the vehicle (102).
3. The system (100) as claimed in claim 1, wherein the control unit (110) being configured to:
receive, the information pertaining to the surroundings of the vehicle (102) from the sensing unit (104) and the steering angle information from the steering angle sensor (106);
determine, the one or more objects based on the information pertaining to the surroundings of the vehicle (102); and
determine, the steering orientation of the vehicle (102) based on the steering angle information.
4. The system (100) as claimed in claim 3, wherein the control unit (110) being configured to:
determine one or more left-sided objects from the one or more objects, when the determined steering orientation is in a left direction of the vehicle (102);
determine one or more central objects from the one or more objects, when the determined steering orientation is in a substantially central direction of the vehicle (102); and
determine one or more right-sided objects from the one or more objects, when the determined steering orientation is in a right direction of the vehicle (102).
5. The system (100) as claimed in claim 3, wherein the control unit (110) being adapted to estimate the distance between the vehicle (102) and the one or more filtered objects, the control unit (110) being adapted to transmit the alert signal with a first intensity when the distance between the vehicle (102) and the one or more filtered objects is below a first threshold value.
6. The system (100) as claimed in claim 5, wherein the control unit (110) being adapted to transmit the alert signal with a second intensity when the distance between the vehicle (102) and the one or more filtered objects is below a second threshold value, wherein the second threshold value is below the first threshold value.
7. The system (100) as claimed in claim 6, wherein the one or more alerting devices (116) being configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert when the distance between the vehicle (102) and the one or more filtered objects is below one of the first threshold value and the second threshold value.
8. The system (100) as claimed in claim 7, wherein the control unit (110) being configured to provide a collision alert to the one or more alerting devices (116) and a collision signal to at least one of a powertrain and a brake unit of the vehicle (102), when the distance between the vehicle (102) and the one or more filtered objects is below a critical value, wherein the critical value is lower than the first threshold value and the second threshold value.
9. The system (100) as claimed in claim 8, wherein
the one or more alerting devices (116) being configured to alert the rider through one of a visual alert, an audible alert, and a haptic alert upon receiving the collision alert; and
the at least one of the powertrain and the brake unit being operated to stop the vehicle (102), upon receiving the collision signal.
10. The system (100) as claimed in claim 1, wherein the one or more alerting devices (116) comprises at least one of a display device, an audible device, and a haptic device.
11. The system (100) as claimed in claim 1, wherein the control unit (110) being adapted to transmit the alert signal to the one or more alerting devices (116) based on the distance between a rear portion of the vehicle (102) and the one or more filtered objects.
12. The system (100) as claimed in claim 1, wherein the vehicle operating condition corresponds to a reverse riding condition of the vehicle (102).
13. A method (400) for providing riding assistance to a rider during a vehicle operating condition of a vehicle (102), the method (400) comprising:
procuring (402), by a sensing unit (104), information pertaining to surroundings of the vehicle (102);
procuring (404), by a steering angle sensor (106), steering angle information of a steering unit;
filtering (406), by a control unit (110) coupled to the sensing unit (104) and the steering angle sensor (106), one or more objects corresponding to a steering orientation for obtaining one or more filtered objects; and
transmitting (408), by the control unit (110), an alert signal to one or more alerting devices (116) based on a distance between the vehicle (102) and the one or more filtered objects, thereby providing riding assistance to the rider during the vehicle operating condition of the vehicle (102).
14. The method (400) as claimed in claim 13 comprising at least one of:
generating, by a radio detection and ranging (RADAR) unit (112) of the sensing unit (104), radar information pertaining to the surroundings of the vehicle (102); and
generating, by an image sensor (114) of the sensing unit (104), image information pertaining to the surroundings of the vehicle (102).
15. The method (400) as claimed in claim 13 comprising:
receiving, by the control unit (110), the information pertaining to the surroundings of the vehicle (102) from the sensing unit (104) and the steering angle information from the steering angle sensor (106), the sensing unit (104) being mounted on the vehicle (102), and the steering angle sensor (106) being coupled to the steering unit of the vehicle (102);
determining, by the control unit (110), the one or more objects based on the information pertaining to the surroundings of the vehicle (102); and
determining, by the control unit (110), the steering orientation of the vehicle (102) based on the steering angle information.
16. The method (400) as claimed in claim 15 comprising:
determining, by the control unit (110), one or more left-sided objects when the determined steering orientation is in a left direction of the vehicle (102);
determining, by the control unit (110), one or more central objects when the determined steering orientation is in a substantially central direction of the vehicle (102); and
determining, by the control unit (110), one or more right-sided objects when the determined steering orientation is in a right direction of the vehicle (102).
17. The method (400) as claimed in claim 15 comprising:
estimating, by the control unit (110), the distance between the vehicle (102) and the one or more filtered objects; and
transmitting, by the control unit (110), the alert signal with a first intensity when the distance between the vehicle (102) and the one or more filtered objects is below a first threshold value.
18. The method (400) as claimed in claim 17 comprising transmitting, by the control unit (110), the alert signal with a second intensity when the distance between the vehicle (102) and the one or more filtered objects is below a second threshold value, wherein the second threshold value is below the first threshold value.
19. The method (400) as claimed in claim 18 comprising alerting, by the one or more alerting devices (116), the rider through one of a visual alert, an audible alert, and a haptic alert when the distance between the vehicle (102) and the one or more filtered objects is below one of the first threshold value and the second threshold value.
20. The method (400) as claimed in claim 19 comprising providing, by the control unit (110), a collision alert to the one or more alerting devices (116) and a collision signal to at least one of a powertrain and a brake unit of the vehicle (102), when the distance between the vehicle (102) and the one or more filtered objects is below a critical value, wherein the critical value is lower than the first threshold value and the second threshold value.
21. The method (400) as claimed in claim 20 comprising:
alerting, by the one or more alerting devices (116), the rider through one of a visual alert, an audible alert, and a haptic alert upon receiving the collision alert; and
operating, at least one of the powertrain and the brake unit, for stopping the vehicle (102) upon receiving the collision signal.
22. The method (400) as claimed in claim 13 comprising transmitting, by the control unit (110), the alert signal to the one or more alerting devices (116) based on the distance between a rear portion of the vehicle (102) and the one or more filtered objects.
Dated this 17th day of August 2023
TVS MOTOR COMPANY LIMITED
By their Agent & Attorney
(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471