
A Vehicle Control Unit

Abstract: The present disclosure provides a method and a vehicle control unit for controlling a vehicle. The method comprises receiving real-time data from sensors. The method comprises generating, in real-time, environmental map data based on the data received from the sensors. The method comprises computing a travel route for travelling from an origin point to a destination point based on the real-time environmental map data. The method comprises predicting a probability of a lane change being performed by a rider at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model. The method comprises identifying a location of blind spots for the rider based on data from the sensors. The method comprises providing alerts to the rider, while travelling on the computed travel route, based on the predicted probability of the lane change and the identified location of the blind spots.


Patent Information

Application #
Filing Date
31 March 2022
Publication Number
40/2023
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING
Status
Email
Parent Application

Applicants

TVS Motor Company Limited
Jayalakshmi Estate, No 29 (Old No 8), Haddows Road
TVS Motor Company Limited
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006

Inventors

1. CHAITANYA RAJENDRA ZANPURE
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006
2. PRATYUSH ABHAY
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006
3. DATTA RAJARAM SAGARE
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006

Specification

TECHNICAL FIELD
[0001] The present subject matter generally relates to a vehicle control unit, and more particularly, relates to a single centralized vehicle control unit capable of performing multiple functions to achieve advanced drive assistance for a rider.
BACKGROUND
[0002] In conventional vehicles, multiple electronic control units (ECUs) are utilized to control the functions of each unit. Such a plurality of ECUs performs several functions in conventional vehicles, such as blind spot detection, lane change assist and vehicle speed change determination. However, for every such capability, a separate ECU is required to achieve the functionality. Typically, the vehicles also have a vehicle control unit (VCU) that can provide torque coordination, operation and gearshift strategies, high-voltage and 48V coordination, charging control, on-board diagnosis, monitoring, thermal management and much more for electrified and connected powertrains. The VCU can be used in electrified passenger cars, trucks and off-highway vehicles, but also in combustion engine applications. The VCU also ensures fail-operational function for highly automated driving (HAD) solutions. Other than these drive-related functions, higher-level versions also support interconnected functions like predictive and automated longitudinal guidance, Advanced Driver Assistance System (ADAS) connection and body controller functions.
[0003] Multiple ECUs need more wired networks, which complicates the overall system. Multiple ECUs can also prevent enabling of ARAS (Advanced Rider Assistance System) functionalities in a vehicle. This ARAS functionality is a very critical technology which operates in hard real time and implements predictive decisions in case of irregularities during driving and in engine/drive-train health.

[0004] To perform such additional vehicle health diagnostics, three or four different electronic control units are fitted to the vehicle to assist the driver and to perform motor control, switch control, lamp control, battery management system functions, etc. However, the addition of three or more electronic control units requires different logic to be embedded and becomes a costly affair. Additionally, in such a configuration, the existing vehicle electronic control unit must be reconfigured to align with the three or more different electronic control units, which leads to complexity of design, usage of higher quantities of wiring harness and lack of usable space in the vehicle.
[0005] Additionally, the assembly of three or more individual electronic control units would not only increase the overall assembly time of the vehicle but would also increase the servicing time and overall overhauling time. Moreover, if the logic codes of the three or more electronic control units performing additional vehicle diagnostics are embedded into the existing vehicle control unit, then the logic complexity of that electronic control unit increases.
[0006] Additionally, each vehicle electronic control unit is embedded with different logic codes to give specific vehicle performance. However, if the additional vehicle diagnostic logic is also embedded into the existing vehicle control unit, then it must be customized with respect to different vehicles. Hence, such a configuration is expensive and a time-consuming process. Moreover, as each of the ECUs is mounted and housed separately, accessing each of the ECUs is difficult, and hence the time taken for upgrading is high.
[0007] Further, there is a limit to how much upgradation can be done, as the ECUs have limited compute power. Thus, if a customer wants to upgrade the vehicle for new functionalities, it is not possible to upgrade the vehicle to its maximum potential, and the time required for upgrading is high. In summary, several drawbacks are associated with multiple control units: more complexity of design; usage of higher quantities of wiring harness and lack of usable space in the vehicle; difficulty in upgrading; the fact that the addition of three or more electronic control units requires different logic to be embedded and becomes a costly affair; and increased overall assembly time, servicing time, upgrading time and overall overhauling time.
[0008] Thus, there arises a need for a single centralized vehicle control unit to perform motor control functions, vehicle body control functions, control of vehicle functionalities, and ARAS functions, making diagnostics and upgradability easier, and that further overcomes the other disadvantages mentioned previously.
SUMMARY

[0009] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.

[00010] According to embodiments illustrated herein, the present disclosure provides a method for controlling a vehicle, the method being performed by a vehicle control unit. The method comprises receiving, by the vehicle control unit, real-time data from a plurality of sensors. The method comprises generating in real-time, by the vehicle control unit, environmental map data based on the data received from the plurality of sensors. The method comprises computing, by the vehicle control unit, a travel route for travelling from an origin point to a destination point based on the real-time environmental map data. In an embodiment, the origin point and the destination point are received from a rider of the vehicle. The method comprises predicting, by the vehicle control unit, a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model. The method comprises identifying, by the vehicle control unit, a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route. In an embodiment, the location is at least one of an LH side, an RH side, a front side and a rear side of the vehicle. The method comprises providing, by the vehicle control unit, one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots.
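The sequence of method steps above can be sketched, purely for illustration, as a minimal control loop. All function names, data shapes, and thresholds in this sketch are assumptions made here for clarity; they are not part of the disclosed unit:

```python
# Illustrative sketch of the claimed control loop: receive sensor data,
# build an environment map, predict a lane change, find blind spots, alert.
# Every name and threshold below is an assumption, not from the disclosure.

def build_environment_map(sensor_data):
    # Fuse raw sensor readings into a simple map representation.
    return {"obstacles": sensor_data.get("lidar", []),
            "position": sensor_data.get("gps")}

def predict_lane_change(gps, traffic, env_map):
    # Placeholder for the binary prediction model: probability in [0, 1].
    return 0.8 if traffic.get("density", 0) > 5 else 0.2

def find_blind_spots(sensor_data):
    # Map proximity readings to blind-spot locations (LH / RH here).
    spots = []
    if sensor_data.get("proximity_lh", 1.0) < 0.5:
        spots.append("LH")
    if sensor_data.get("proximity_rh", 1.0) < 0.5:
        spots.append("RH")
    return spots

def control_step(sensor_data, traffic):
    env_map = build_environment_map(sensor_data)
    p_change = predict_lane_change(env_map["position"], traffic, env_map)
    spots = find_blind_spots(sensor_data)
    alerts = []
    if p_change > 0.5 and spots:
        alerts.append(f"lane-change risk near blind spot(s): {spots}")
    return alerts
```

In a real unit each placeholder would be backed by the corresponding logical unit (environment data generation, lane change prediction, blind spot detection, alerting) described later in the specification.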
[00011] According to embodiments illustrated herein, the present disclosure provides a vehicle control unit for a vehicle. The vehicle control unit comprises a processor and a computer-readable medium communicatively coupled to the processor, the computer-readable medium storing processor-executable instructions which, when executed by the processor, cause the processor to receive real-time data from a plurality of sensors. The processor is further configured to generate in real-time environmental map data based on the data received from the plurality of sensors. The processor is further configured to compute a travel route for travelling from an origin point to a destination point based on the real-time environmental map data. In an embodiment, the origin point and the destination point are received from a rider of the vehicle. The processor is further configured to predict a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model. The processor is further configured to identify a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route. In an embodiment, the location is at least one of an LH side, an RH side, a front side and a rear side of the vehicle. The processor is further configured to provide one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots.

BRIEF DESCRIPTION OF DRAWINGS

[00012] The present invention will become more fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration only, and thus are not limitative of the present invention.
[00013] The detailed description is described with reference to the accompanying figures, which relate to a movable load-carrying device, namely a two-wheeled saddle type vehicle, being one embodiment of the present subject matter. However, the present subject matter is not limited to the depicted embodiment(s). In the figures, the same or similar numbers are used throughout to reference features and components.
[00014] Fig. 1 illustrates a left side view of a vehicle in which the vehicle control unit is disposed, in accordance with an embodiment of the present subject matter.

[00015] Fig. 2 illustrates a block diagram of the vehicle comprising the vehicle control unit, in accordance with an embodiment of the present subject matter.
[00016] Fig. 3 illustrates a flowchart of a method for controlling the vehicle using the vehicle control unit, in accordance with an embodiment of the present subject matter.
[00017] Fig. 4 illustrates a flowchart for lane change prediction, in accordance with an embodiment of the present subject matter.
[00018] Fig. 5 illustrates a flowchart for blind spot detection, in accordance with an embodiment of the present subject matter.

[00019] Fig. 6 illustrates a flowchart for a speed control/recommendation mechanism, in accordance with an embodiment of the present subject matter.

DETAILED DESCRIPTION

[00020] The present disclosure may be best understood with reference to the detailed figures and description set forth herein. Various embodiments are discussed below with reference to the figures. However, those skilled in the art will readily appreciate that the detailed descriptions given herein with respect to the figures are simply for explanatory purposes, as the methods and systems may extend beyond the described embodiments. For example, the teachings presented and the needs of a particular application may yield multiple alternative and suitable approaches to implement the functionality of any detail described herein. Therefore, any approach may extend beyond the particular implementation choices in the following embodiments described and shown.
[00021] References to “one embodiment,” “at least one embodiment,” “an embodiment,” “one example,” “an example,” “for example,” and so on indicate that the embodiment(s) or example(s) may include a particular feature, structure, characteristic, property, element, or limitation, but that not every embodiment or example necessarily includes that particular feature, structure, characteristic, property, element, or limitation. Further, repeated use of the phrase “in an embodiment” does not necessarily refer to the same embodiment.
[00022] The present invention will now be described more fully hereinafter with different embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, those embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
[00023] The present subject matter is further described with reference to the accompanying figures. It should be noted that the description and figures merely illustrate principles of the present subject matter. Various arrangements may be devised that, although not explicitly described or shown herein, encompass the principles of the present subject matter. Moreover, all statements herein reciting principles, aspects, and examples of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.
[00024] Various features and embodiments of the present subject matter will be discernible from the following further description thereof, set out hereunder. It is contemplated that the concepts of the present subject matter may be applied to any kind of vehicle within the spirit and scope of this subject matter. The detailed explanation of the constitution of parts other than the present subject matter, which constitutes an essential part, has been omitted at suitable places.
[00025] It is an object of the present invention to reduce the number of electronic control units and have a single integrated vehicle control unit for performing all functions and additional vehicle diagnostics functions of the vehicle. It is further an object of the present invention to provide the vehicle control unit compatible with different vehicles to perform vehicle diagnostics and multiple other functions. It is further an object of the present invention to overcome the space constraint while assembling one or more control units in the vehicle layout. It is further an object of the present invention to reduce the part count by reducing the number of mounting brackets and fasteners required for mounting the one or more control units in the vehicle. It is further an object of the present invention to reduce the number of lengthy wires required to connect the one or more control units with the existing vehicle control unit. It is further an object of the present invention to provide the vehicle control unit that is configured to perform a plurality of functions comprising motor control functions, vehicle body control functions, mapping environment data, predicting lane change, detecting blind spot, controlling speed, alerting functions, battery monitoring functions, and ARAS functions.
[00026] The present subject matter along with all the accompanying embodiments and their other advantages would be described in greater detail in conjunction with the figures in the following paragraphs.

[00027] The present subject matter discloses a vehicle having a vehicle control unit disposed in a cavity formed in a space between the pair of upper tubes, the pair of central tubes, and the rear portion of the pair of down tubes. The vehicle control unit is mounted to the pair of central tubes to conform with the shape of the cavity through one or more mounting points close to the centre of gravity (GG’) of the vehicle.
[00028] In another embodiment, the vehicle control unit is mounted in a front portion of the vehicle, disposed below a handlebar assembly. In yet another embodiment, the vehicle control unit is capable of performing blind spot detection and lane change prediction by creating environment map data, and a speed control or recommendation mechanism. In another embodiment, the vehicle control unit is capable of sending one or more alerts to the driver of the vehicle based on the blind spot detection mechanism, the lane change prediction mechanism and the speed control or recommendation mechanism.
[00029] The present subject matter further discloses a method for lane change prediction by the vehicle control unit. It further discloses a method for blind spot detection by the vehicle control unit. It further discloses a method for speed control and recommendation by the vehicle control unit.

[00031] The present subject matter may be implemented in any two-wheeled vehicle. However, for the purpose of explanation and by no means limitation, the present invention and its corresponding additional advantages and features are described through the following embodiments depicting a two-wheeled vehicle.

[00032] Figure 1 shows a side elevational view of a motorcycle incorporating the invention.
[00033] With reference to Figure 1, 100 denotes a vehicle, such as a motorcycle, 2 denotes a front wheel, 3 denotes a rear wheel, 4 denotes a front fork, 5 denotes a seat, 6 denotes a rear fork, 7 denotes a leg shield made of resin or metal, 8 denotes a headlight, 9 denotes a tail light, 10 denotes an aesthetic covering, 11 denotes a battery fitted inside the aesthetic covering, 12 denotes a fuel tank, and 13 denotes a handlebar. Further, a main frame extends along a center of a body of the vehicle from a front portion of the vehicle in a rearward direction. The main frame is made up of a metallic pipe.

[00034] In an embodiment, the vehicle 100 may be a scooter type vehicle and may have a main frame that extends along a center of the body of the vehicle from a front portion of the vehicle in a rearward direction. The main frame is made up of a metallic pipe, and the main frame is provided under the floor board for a scooter type vehicle. A swing type power unit is coupled to the rear end of the main frame for a scooter type vehicle. A rear wheel is supported on one side of the rear end of the swing type power unit. In an embodiment, the swing type power unit is suspended in the rear of a body frame for a scooter type vehicle.
[00035] The center of the body for a scooter type vehicle forms a low floor board, functioning as a place for the rider's feet, and an under cowl which is located below the rider's seat and covers at least a part of the engine. In an embodiment, the under cowl is made up of metal or resin. The under cowl is hinged to the seat. Further, a utility box opens from the rear end to the hinged portion. In an embodiment, the utility box is provided under the seat, extending longitudinally of the vehicle body, and the inside of the utility box has a large capacity so that a large article, such as a helmet, can be housed. Additionally, in a scooter type vehicle, side covers on both the left and right sides cover the utility box and other parts of the vehicle, thereby providing a good appearance to the vehicle.

[00036] In an embodiment, the vehicle (100) comprises a frame assembly when viewed from a front (F) to rear (R) direction of the vehicle (100). The frame assembly comprises a headtube and a pair of upper tubes extending from the headtube on both the left and right sides. A pair of intermediate tubes extends downwardly from the head tube in the front portion (F) of the vehicle (100). A pair of front suspensions is mounted in the front portion (F) of the vehicle (100), connecting to a front wheel of the vehicle.
[00037] The vehicle (100) is configured to move through the front wheel and a rear wheel. The pair of intermediate tubes joins a front portion of a pair of down tubes on both the left and right sides of the vehicle (100). The pair of upper tubes extends rearward from the head tube, terminating at a pair of central tubes. The pair of central tubes is configured to be joined at a rear portion of the pair of down tubes. The pair of central tubes is also configured to have mounting provisions for a centre stand.
[00038] The vehicle (100) includes a seat disposed at the rear end of the vehicle (100), and the seat is mounted on a pair of seat rails of the frame assembly. The pair of seat rails has a trellis internal feature created by one or more connecting tubes. This structure provides strength and rigidity to the overall frame assembly of the vehicle (100). The vehicle (100) is configured to have a mono shock absorber disposed in the rear portion of the vehicle for absorbing impacts from the road. Further, a transmission system for transmitting power to the rear wheel is disposed in the rear portion of the vehicle (100). In one embodiment, the vehicle (100) is a two-wheeled or a three-wheeled vehicle.
[00039] A vehicle control unit is disposed in a cavity of the vehicle (100). The cavity is formed in a space between the pair of upper tubes, the pair of central tubes and the rear portion of the down tubes. The vehicle control unit is mounted to the pair of central tubes to conform with the shape of the cavity through one or more mounting points close to the centre of gravity (GG’) of the vehicle (100). In another embodiment, the vehicle control unit is placed in a front portion of the vehicle (100). In another embodiment, the vehicle control unit is mounted in a cavity, the cavity being defined between a pair of upper tubes (110a), a pair of central tubes (112a) and a rear portion of down tubes (108a) of the vehicle, the vehicle control unit being mounted close to the centre of gravity (GG’) of the vehicle (100). In another embodiment, the vehicle control unit (130) is disposed in a front portion of the vehicle (100), below a handlebar assembly and in front of the head tube.
[00040] Fig. 2 illustrates a block diagram of the vehicle 100 comprising the vehicle control unit 101, in accordance with an embodiment of the present subject matter. The vehicle 100 comprises the vehicle control unit 101, a plurality of sensors 116, an instrument cluster 118, an engine 114, a battery 112, an ISG machine 110, an input/output unit 111, a power supply 108, and a transceiver 106. The vehicle control unit 101, the plurality of sensors 116, the instrument cluster 118, the engine 114, the battery 112, the ISG machine 110, the input/output unit 111, the power supply 108, and the transceiver 106 are communicatively coupled with each other via wired or wireless networks.
[00041] The vehicle control unit 101 comprises a processor 102 and a memory 104. Further, the processor 102 comprises a plurality of logical units comprising an environment data generation unit 102a, a lane change prediction unit 102b, a blind spot detection unit 102c, a speed recommendation unit 102d, and an alerting unit 102e.
[00042] The processor 102 may include suitable logic, circuitry, interfaces, and/or code that may be configured to execute a set of instructions stored in the memory 104. The processor 102 may be implemented based on a number of processor technologies known in the art. The processor 102 may work in coordination with the transceiver 106, the lane change prediction unit 102b, and the input/output unit 111 to predict the change of lane. Examples of the processor 102 include, but are not limited to, an X86-based processor, a Reduced Instruction Set Computing (RISC) processor, an Application-Specific Integrated Circuit (ASIC) processor, a Complex Instruction Set Computing (CISC) processor, and/or other processors.
[00043] The memory 104 comprises suitable logic, circuitry, interfaces, and/or code that is configured to store the set of instructions which may be executed by the VCU 101. In an embodiment, the memory 104 may be configured to store one or more programs, routines, or scripts that may be executed in coordination with the VCU 101. The memory 104 may be implemented based on a Random Access Memory (RAM), a Read-Only Memory (ROM), a Hard Disk Drive (HDD), a storage server, and/or a Secure Digital (SD) card.

[00044] The environment data generation unit 102a comprises suitable logic, circuitry, interfaces, and/or code that is configured to generate environmental map data based on the data received from the plurality of sensors.
[00045] The lane change prediction unit 102b comprises suitable logic, circuitry, interfaces, and/or code that is configured to predict a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model.
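As one illustration, a binary prediction model of the kind recited could be realized as a logistic-regression classifier over GPS-, traffic- and map-derived features. The feature names and weights below are hypothetical assumptions of this sketch; a real model would be trained on ride data:

```python
import math

# Hypothetical feature weights; these are assumptions for illustration,
# not values from the disclosure. A real model would learn them from data.
WEIGHTS = {"dist_to_turn_m": -0.01,   # nearer turns raise the probability
           "traffic_density": 0.3,    # denser traffic raises it
           "lanes": 0.5}              # more lanes raise it
BIAS = -1.0

def lane_change_probability(features):
    """Logistic regression: P(lane change) = sigmoid(w . x + b)."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))
```

The output in [0, 1] can then be thresholded (e.g. at 0.5) to yield the binary lane-change/no-lane-change decision used downstream for alerting.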
[00046] The blind spot detection unit 102c comprises suitable logic, circuitry, interfaces, and/or code that is configured to identify a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route. In an embodiment, the location is at least one of an LH side, an RH side, a front side and a rear side of the vehicle.
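One way such a unit might map sensor detections onto the recited LH/RH/front/rear locations is a simple angular-zone rule over the Cartesian coordinates reported by the LIDAR. The coordinate convention and the angular thresholds below are illustrative assumptions, not taken from the disclosure:

```python
import math

def blind_spot_location(x, y):
    """Classify a detected vehicle at Cartesian (x, y) metres into a
    blind-spot zone, with the ego vehicle at the origin facing +y.
    Angle convention: 0 deg = RH side, 90 deg = front. Thresholds are
    illustrative assumptions."""
    angle = math.degrees(math.atan2(y, x))
    if 45 <= angle <= 135:
        return "front"
    if -135 <= angle <= -45:
        return "rear"
    return "LH" if abs(angle) > 135 else "RH"
```

A detection falling in the LH or RH zone within some proximity threshold would then trigger the corresponding blind-spot alert.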

[00047] The speed recommendation unit 102d comprises suitable logic, circuitry, interfaces, and/or code that is configured to control the speed of the vehicle based on a traffic sign status, an associated duration of the traffic sign status and a historical riding pattern of the rider at a plurality of junctions enroute on the travel route, the traffic sign status and the associated duration of the traffic sign status being determined based on at least one of the plurality of sensors or received from a central traffic server configured to provide Infrastructure as a Service (IaaS).
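A minimal sketch of such a speed control/recommendation rule might combine the sign status, its remaining duration and the rider's historical approach speed. The heuristic and all thresholds below are assumptions made for illustration only:

```python
def recommend_speed(current_speed_kmh, sign_status, seconds_remaining,
                    distance_to_junction_m, historical_speed_kmh):
    """Recommend a speed so the vehicle reaches the junction while the
    light is green, biased toward the rider's usual approach speed.
    Heuristic and constants are illustrative assumptions."""
    if sign_status == "green":
        # Time needed to cover the distance at the current speed (seconds).
        eta = distance_to_junction_m / (current_speed_kmh / 3.6)
        if eta <= seconds_remaining:
            return current_speed_kmh        # will make the green light
        # Otherwise ease toward the rider's historical approach speed.
        return min(current_speed_kmh, historical_speed_kmh)
    # Red/amber: back off relative to the historical pattern.
    return min(current_speed_kmh, historical_speed_kmh) * 0.8
```

With the Table A scenario (40 km/h, green sign for 45 s, junction 200 m away), the rule would leave the speed unchanged, since the junction is reached well within the green phase.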
[00048] The alerting unit 102e comprises suitable logic, circuitry, interfaces, and/or code that is configured to provide one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots.
[00049] The transceiver 106 may include suitable logic, circuitry, interfaces, and/or code that may be configured to receive the real-time data from the plurality of sensors. The transceiver 106 may further be configured to transmit the one or more alerts to the instrument cluster or a smart helmet of the rider, via a communication network. The transceiver 106 may implement one or more known technologies to support wired or wireless communication with the communication network. In an embodiment, the transceiver 106 may include, but is not limited to, an antenna, a radio frequency (RF) transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a Universal Serial Bus (USB) device, a coder-decoder (CODEC) chipset, a subscriber identity module (SIM) card, and/or a local buffer. The transceiver 106 may communicate via wireless communication with networks, such as the Internet, an Intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN). The wireless communication may use any of a plurality of communication standards, protocols and technologies, such as Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email, instant messaging, and/or Short Message Service (SMS).

[00050] The input/output unit 111 comprises suitable logic, circuitry, interfaces, and/or code that is configured to receive an input or transmit an output to the instrument cluster or a smart helmet of the rider. The input/output unit 111 may include various input and output devices that are configured to communicate with the processor 102. Examples of the input devices include, but are not limited to, a keyboard, a mouse, a joystick, a touch screen, a microphone, a camera, and/or a docking station. Examples of the output devices include, but are not limited to, a display screen and/or a speaker.
[00051] The battery 112 provides power supply to various electrical components, such as the head light, tail light, and the like, of the vehicle.

[00052] The engine 114 may correspond to an internal combustion engine for the vehicle, which may be air cooled or water cooled. In an embodiment, the engine 114 may comprise a drive train for a vehicle, in particular for a motorcycle, with an automated sequential manual transmission which can be shifted manually via a shift shaft and which can be connected to an internal combustion engine via a clutch.
[00053] The plurality of sensors 116 transmits sensor data to the VCU 101, with which it is communicatively coupled.
[00054] The GPS sensor 116a is configured to determine a location of a rider of the vehicle. In an embodiment, the GPS sensor may be installed on the vehicle or on a computing device that the rider may be carrying while driving the vehicle. The LIDAR sensor 116b is configured to calculate the velocity and cartesian coordinates of one or more vehicles that are travelling towards the vehicle from either side of the vehicle. In an embodiment, the LIDAR is disposed at a rear end of the vehicle. The steering position sensor 116c provides a steering angle at which the vehicle will enter at least one of the city road or the freeway road.
[00055] The instrument cluster (118) with an integrated Bluetooth module is connected to the VCU of the vehicle and to the smart helmet or other connected gadget through hardwired connections, a CAN bus network and a Bluetooth medium. The vehicle sensors, such as the side stand sensor, engine RPM sensor and vehicle speed sensor, are connected to the vehicle ECUs through hardwire. The instrument cluster (118) and a mobile application are connected through Bluetooth, and data is transferred and received between the devices.

[00056] In operation, the vehicle control unit is an integrated vehicle control unit that is configured to combine the functioning of a plurality of electronic control units and perform a plurality of functions, the plurality of functions comprising motor control functions, vehicle body control functions, mapping environment data, predicting lane change, detecting blind spot, controlling speed, alerting functions, battery monitoring functions, and ARAS functions.
[00057] The vehicle body control functions include controlling operation of a side stand, opening of a fuel tank lid, or opening of a lid that covers a power source used for driving the vehicle, the vehicle functionalities include
15 controlling intensity, operating frequency and wavelength associated with the of the one or more light sources, providing a current to charge one or more external devices, and the ARAS functions include detecting driving drowsiness, controlling vehicle, sending alerts, automatic controlling of TSL, automated parking.
[00058] The VCU receives real-time data from a plurality of sensors. In an embodiment, the plurality of sensors comprises a first lidar sensor mounted in a front portion of the vehicle, a second lidar sensor mounted in a rear portion of the vehicle, a first image sensor mounted in the front portion of the vehicle, a second image sensor mounted in the rear portion of the vehicle, a first proximity sensor mounted on a LH side of a handle bar assembly of the vehicle, a second proximity sensor mounted on a RH side of the handle bar assembly of the vehicle, a GPS sensor, an IMU sensor, and a speed sensor.
[00059] Table A below represents a scenario which illustrates what the real-time environment data looks like.

Sensor                                    Real-time data
GPS sensor                                x latitude and y longitude
Current speed                             40 km/h
No. of vehicles in the front              4
No. of vehicles in the rear               6
IMU sensor                                Lean angle = 35 degrees
Proximity sensor LH                       1 vehicle close by at front LH
Proximity sensor RH                       1 vehicle close by at rear RH
Image sensor                              Traffic sign is green and time duration is 45 seconds;
                                          4 trees along LH of vehicle;
                                          5 pedestrians waiting to cross the road at a junction
                                          that will be approached in 200 m and 30 s;
                                          road lane markings;
                                          speed limit markings on road, i.e. 80 km/h
Origin location                           ABC
Destination location                      XYZ
Distance between origin and destination   5 km

Table A
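The snapshot in Table A can be pictured as a single data structure held by the VCU. The sketch below is illustrative only: the field names and types are assumptions for this write-up, not taken from the specification.

```python
# Illustrative sketch: field names are assumptions, not from the specification.
# It shows one way a VCU might hold the Table A sensor snapshot.
from dataclasses import dataclass

@dataclass
class SensorSnapshot:
    latitude: float
    longitude: float
    speed_kmph: float            # current vehicle speed
    vehicles_front: int          # count from front lidar/image sensor
    vehicles_rear: int           # count from rear lidar/image sensor
    lean_angle_deg: float        # from the IMU sensor
    proximity_lh: bool           # object close by on the left-hand side
    proximity_rh: bool           # object close by on the right-hand side
    traffic_sign: str = "green"  # traffic sign status from image sensor
    sign_duration_s: int = 45    # remaining duration of the sign status

# Values mirror the Table A scenario; coordinates are placeholders.
snapshot = SensorSnapshot(
    latitude=12.97, longitude=77.59, speed_kmph=40,
    vehicles_front=4, vehicles_rear=6, lean_angle_deg=35,
    proximity_lh=True, proximity_rh=True,
)
print(snapshot.speed_kmph)  # 40.0 km/h, as in Table A
```

Downstream units (map generation, lane-change prediction, speed recommendation) would then consume such a snapshot each cycle.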

[00060] In an embodiment, the real-time data comprises a distance between the vehicle and surrounding vehicles, traffic sign status data, road lane markings, a current location of the vehicle, and a current speed of the vehicle. For example, the distance between the vehicle and a surrounding vehicle may be 0.3 metres, the traffic sign status may show green color, and the current speed may be 40 km/h. The environment map generation unit helps in generating, in real-time, environmental map data based on the data received from the plurality of sensors. The plurality of sensors, as described in Table A, can be an image sensor, a proximity sensor, an IMU sensor, a speed sensor and the like. Each of the sensors senses the respective data and provides the input to the VCU. For example, the image sensor captures the surrounding images and provides the user information pertaining to nearby trees, vehicles, traffic sign status, pedestrians, speed breakers, reflectors, and road lane markings. The proximity sensors also help to get information about objects which are in close proximity to the rider's vehicle. The speed sensor senses information pertaining to the speed of the vehicle.

[00061] In an embodiment, the environmental map data comprises one or more vehicles surrounding the vehicle and one or more objects within a travel path of the vehicle. In an embodiment, the one or more objects comprise trees, sign boards, pedestrians, speed breakers, and reflectors.
[00062] In an embodiment, computing, by the vehicle control unit, a travel route for travelling from an origin point to a destination point based on the real-time environmental map data. In an embodiment, the origin point and the destination point are received from a rider of the vehicle. As depicted in Table A, the travel route from origin point ABC to destination point XYZ will be computed by the VCU based on the real-time environment map data, meaning that the nearby trees, vehicles, traffic sign status, pedestrians, speed breakers, reflectors, and road lane markings will be captured in real time, providing information to the rider and helping the rider to get alerts for lane change, blind spots if any, and speed recommendations.

[00063] Based on the real-time environmental map data being updated as shown in Table A, the travel route may be computed such that the distance between the origin and the destination is covered while ensuring safety of the rider and the surrounding vehicles. For example, as the VCU is aware that 5 pedestrians are waiting near the junction to cross after 30 seconds, in an embodiment the VCU may recommend that the rider slow down, or may suggest an alternate route to avoid any speed breakers, pedestrians or other hindrances.
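The route choice described above can be sketched as a comparison between candidate routes based on the hazards found in the environment map. The route names, fields and the "fewest hazards wins" rule below are invented for illustration; the specification does not define a scoring formula.

```python
# Hypothetical sketch: the specification only says the VCU may suggest an
# alternate route to avoid hazards; the scoring rule here is an assumption.
def pick_route(routes):
    """Choose the candidate route with the fewest mapped hazards
    (pedestrian crossings, speed breakers, potholes)."""
    return min(routes, key=lambda r: r["hazards"])

candidates = [
    {"name": "direct",    "distance_km": 5.0, "hazards": 3},
    {"name": "alternate", "distance_km": 5.6, "hazards": 1},
]
print(pick_route(candidates)["name"])  # alternate
```

A real implementation would also weigh distance and travel time against hazard count rather than minimizing hazards alone.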
[00064] In an embodiment, predicting, by the vehicle control unit, a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model. For example, if the rider is riding the vehicle in lane A, and the image sensor captures and provides information relating to speed breakers present in lane A, the lane change prediction unit provides a lane change alert to the rider by either illuminating the edges of the side view mirror or by displaying an alert on the instrument cluster. The alerts can be in the form of an illuminating lamp, an audible sound, a change in intensity, frequency or wavelength of the lamp, haptic feedback and the like.

[00065] In an embodiment, predicting the probability of the lane change comprises receiving, by the vehicle control unit, image data from one or more image sensors mounted in a front portion and a rear portion of the vehicle, IMU data and TSL lamp data, and correlating, by the vehicle control unit, the received data along with data received from the plurality of sensors for predicting the probability of the lane change based on one or more statistical techniques. In an embodiment, an output of the correlation is fed as input to the binary prediction model to determine the probability of the lane change. For example, suppose the rider is riding in lane B, which is about to take a turn. The image sensors provide the information that there is a pothole in lane B and that there are blind spots towards the rear side of the vehicle, and the IMU sensor provides information that the vehicle is leaning at 35 degrees towards the turn. The present subject matter helps in predicting the probability of the lane change in order to help the rider switch to lane C so as to avoid potholes and to have a lean angle, which may be 60 degrees, that provides utmost safety to the rider.
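One common realization of a binary prediction model over correlated sensor features is logistic regression. The sketch below is an illustrative stand-in, not the model disclosed here: the feature names and weights are invented, and a production model would be trained on real riding data.

```python
# Sketch of a binary lane-change predictor. Weights and feature names are
# placeholder assumptions; the specification only states that correlated
# sensor data feeds a binary prediction model.
import math

WEIGHTS = {"bias": -2.0, "lean_angle_deg": 0.05,
           "pothole_ahead": 1.5, "tsl_on": 1.0}

def lane_change_probability(features):
    """Logistic (sigmoid) model: weighted sum squashed to [0, 1]."""
    z = WEIGHTS["bias"] + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Leaning 35 degrees, pothole ahead, turn signal ON -> high probability.
p = lane_change_probability(
    {"lean_angle_deg": 35, "pothole_ahead": 1, "tsl_on": 1})
print(p > 0.5)  # True
```

The binary output follows by thresholding the probability (e.g. alert when p > 0.5).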

[00066] In an embodiment, the probability of the lane change is predicted based on a set of pre-defined conditions, the set of pre-defined conditions being at least one of a first condition indicative of camera and steering angle sensor data determining a lane change with the speed of the vehicle being greater than zero, and a second condition indicative of a TSL associated with a first direction being ON while the vehicle movement is in an opposite direction. For example, the rider may manually indicate that he is shifting towards the left lane; however, if the VCU monitors that the parameters show the vehicle leaning towards the right lane, the VCU will recognize this and provide an alert for the right lane only, by illuminating the right edge of the right side view mirror.
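The two pre-defined conditions above can be sketched as a simple boolean check. The signal names (camera flag, steer flag, TSL direction, roll direction) are assumptions for illustration; the specification names the conditions but not an interface.

```python
# Sketch of the two pre-defined trigger conditions described above.
# All parameter names are illustrative assumptions.
from typing import Optional

def lane_change_detected(camera_sees_change: bool,
                         steering_sees_change: bool,
                         speed_kmph: float,
                         tsl_direction: Optional[str],
                         roll_direction: str) -> bool:
    # Condition 1: camera and steering-angle data both indicate a lane
    # change and the vehicle is moving (speed greater than zero).
    cond1 = camera_sees_change and steering_sees_change and speed_kmph > 0
    # Condition 2: a turn signal lamp is ON for one direction while the
    # vehicle actually rolls (moves) in the opposite direction.
    cond2 = tsl_direction is not None and roll_direction != tsl_direction
    return cond1 or cond2

# Rider signals left but the vehicle leans right: condition 2 fires.
print(lane_change_detected(False, False, 40, "left", "right"))  # True
```

Either condition alone is enough to raise the lane-change alert, matching the "at least one of" wording.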

[00067] In an embodiment, identifying, by the vehicle control unit, a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route, the location being at least one of a LH side, an RH side, a front side and a rear side of the vehicle. For example, the VCU identifies one or more blind spots present in the computed travel route from ABC to XYZ, which can be either at the rear LH or the rear RH side of the vehicle; the information pertaining to the blind spots is captured by the image sensors.
[00068] In an embodiment, providing, by the vehicle control unit, one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots. For example, the VCU provides visual, audible or tactile stimulation alerts to the rider on identifying a lane change from lane B to lane C or blind spots on the computed travel route.

[00069] In an embodiment, the one or more alerts comprises at least one of indicating, by the vehicle control unit, a location of the one or more blind spots along a plurality of sides of a side view mirror of the vehicle. In an embodiment, each of the plurality of sides of the side view mirror is mapped with a corresponding at least one of the LH side, the RH side, the front side and the rear side of the vehicle. In an embodiment, the indication comprises illuminating one or more light sources disposed along the plurality of sides of the side view mirror, for example, illuminating the edges of the side view mirror if the rider is about to change a lane. For example, if the rider is about to change the lane towards the left side, the left edge of the left mirror will start illuminating; if the rider is about to change the lane towards the right side, the right edge of the right rear view mirror will start illuminating. This helps the surrounding riders to be more aware of whether the rider is about to change a lane.
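The direction-to-mirror-edge mapping above is small enough to sketch as a lookup table. The lamp identifiers are hypothetical names, not part of the specification.

```python
# Illustrative mapping from predicted lane-change direction to the mirror
# edge to illuminate; the lamp identifiers are hypothetical.
MIRROR_LAMP = {
    "left":  "left_edge_of_left_mirror",
    "right": "right_edge_of_right_mirror",
}

def alert_lamp(direction: str) -> str:
    """Return which mirror-edge light source to illuminate."""
    return MIRROR_LAMP[direction]

print(alert_lamp("left"))   # left_edge_of_left_mirror
print(alert_lamp("right"))  # right_edge_of_right_mirror
```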
[00070] In an embodiment, indicating, by the vehicle control unit, one or more regions comprising the one or more blind spots in real-time on an instrument cluster of the vehicle, wherein a pictorial representation of the vehicle is displayed on at least one of the instrument cluster, a visor of a smart helmet and a heads-up display unit of the vehicle, with the one or more regions being highlighted on the instrument cluster. For example, the instrument cluster displays the pictorial representation of the vehicle along with the one or more blind spots, which can be on either side of the vehicle. If the rider is riding in lane A, while cornering he may encounter blind spots, which can be displayed on the instrument cluster itself or on the visor of the smart helmet or the heads-up display unit.
[00071] In an embodiment, providing, by the vehicle control unit, haptic feedback to the rider of the vehicle a predefined time before the particular time instant at which the rider will change the lane, in response to identification of the location of the one or more blind spots. For example, while changing lanes, the lane change prediction unit provides alerts, which can be in the form of haptic feedback or tactile stimulation alerts, when the rider is about to change from lane B to lane C on facing difficulties in lane B.
[00072] In an embodiment, providing, by the vehicle control unit, at least one of a forward collision alert, a rear collision alert, a merging-traffic alert and a lane departure warning based on the environmental map data. For example, if a surrounding vehicle is about to collide with the rider's vehicle from the rear side, the VCU, with the help of the real-time environment map data and the plurality of sensors such as the image sensor, will detect the rear collision which is about to happen and give alerts to the rider in order to ensure safety. Additionally, it also gives a forward collision alert, a rear collision alert, a merging-traffic alert and a lane departure warning.
[00073] In an embodiment, controlling, by the vehicle control unit, the speed of the vehicle based on a traffic sign status, an associated duration of the traffic sign status and a historical riding pattern of the rider at a plurality of junctions enroute on the travel route. For example, the speed of the vehicle can be 30 km/h as detected by the speed sensor, and the traffic sign status can be green as detected by the image sensor.
[00074] In an embodiment, the traffic sign status comprises one of red, yellow and green, and each traffic sign status has an associated duration. For example, the traffic sign status shows a red light with a duration of 30 seconds or a green light with a duration of 15 seconds.
[00075] In an embodiment, the historical riding pattern comprises braking timing, acceleration timing, and gear shift pattern. In an embodiment, the traffic sign status and the associated duration of the traffic sign status are determined based on at least one of the plurality of sensors or received from a central traffic server configured to provide Infrastructure as a Service (IaaS). For example, suppose a rider is riding at a speed and the traffic light is green; however, it is about to turn red, as shown in the duration (15 sec) status of the traffic sign. In conventional systems, the rider used to accelerate the vehicle, and the moment it reaches the traffic signal, it turns red, leading to extreme wastage of fuel. Such fuel wastage can be prevented with the present subject matter.
[00076] In an embodiment, the controlling comprises recommending, by the vehicle control unit, at least one of maintaining a current speed, accelerating to a first speed or decelerating to a second speed, and maintaining at least one of the first speed or the second speed from the current speed of the vehicle, based on the traffic sign status and the associated duration of the traffic sign status at a plurality of junctions enroute on the travel route. In an embodiment, in an autonomous driving mode of the vehicle, the vehicle control unit is configured to perform the recommendations and control the speed of the vehicle based on the environmental map data. For example, if a rider is riding at a speed of 30 km/h and the traffic light is green with a duration of 15 seconds, the VCU will provide a speed recommendation of, say, 55 km/h as per the history and riding pattern of the rider, which can be safely achieved by the rider to cross the junction within time and also avoids fuel wastage. In another example, the rider may be riding at 60 km/h with the traffic sign status showing green, and the VCU may provide a speed recommendation to maintain a constant speed in order to cross the junction. In another example, the rider may be riding at 70 km/h on a road with a speed limit of 60 km/h; in this case, the VCU will provide a recommendation to reduce the speed and abide by the traffic rules, providing utmost safety to the rider and surrounding riders. In another example, the rider may be riding at 30 km/h while the traffic sign status shows green with a duration of 5 seconds; the VCU will provide a speed recommendation to reduce the speed, as there is no way to cross the junction even if the vehicle is accelerated.
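The four examples above can be condensed into a small decision sketch: compute the speed needed to reach the junction before the green phase ends, then recommend maintaining, accelerating, or decelerating. The thresholds and the `max_safe_kmph` cap are assumptions; the specification describes the outcomes, not a formula.

```python
# Sketch of the speed-recommendation logic in the examples above.
# Decision thresholds are illustrative assumptions.
def recommend_speed(current_kmph, distance_m, green_left_s,
                    speed_limit_kmph, max_safe_kmph=80):
    # Never recommend exceeding the posted limit (70 km/h example).
    if current_kmph > speed_limit_kmph:
        return speed_limit_kmph               # decelerate to the limit
    # Speed needed to reach the junction before the green phase ends.
    needed = (distance_m / 1000.0) / (green_left_s / 3600.0)
    if needed <= current_kmph:
        return current_kmph                   # maintain constant speed
    if needed <= min(speed_limit_kmph, max_safe_kmph):
        return round(needed)                  # safely achievable: accelerate
    return max(current_kmph - 10, 0)          # cannot make it: decelerate

# 30 km/h, junction 200 m away, 15 s of green left, 60 km/h limit.
print(recommend_speed(30, 200, 15, 60))  # 48
```

With only 5 seconds of green left the required speed exceeds the limit, so the sketch recommends slowing down, matching the final example.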

[00077] In an embodiment, remotely flashing the plurality of functions to the vehicle control unit using OTA functionality. For example, if the vehicle functions have to be upgraded, this can be done remotely using over-the-air functionality, and the rider does not need to bring the vehicle to a third party or a service center to upgrade it, thus providing remote upgradation of the vehicle and advanced service to the riders. Further, the plurality of functions may be provided as a subscription service to the rider for a pre-defined period of time, and after the pre-defined time is over, at least one of the plurality of functions may be automatically disabled.
[00078] In a working example, consider a rider riding at a speed of 40 km/h from an origin point to a destination point. As the rider provides the inputs of the origin and destination points to the vehicle, the environment data generation unit generates real-time environment map data for the mentioned route. The real-time environment map data enables the rider to be aware of the surrounding objects: trees, speed breakers, traffic sign status and surrounding vehicles. Other action units, such as the lane change prediction unit, blind spot detection unit, speed recommendation unit and alerting unit, serve various purposes, such as making the rider more aware by predicting a change of lane. Additionally, the system sensitizes surrounding riders to be more aware if the rider is about to change the lane. This is done in several ways. One way can be illuminating the edges of the side view mirror if the rider is about to change a lane. For example, if the rider is about to change the lane towards the left side, the left edge of the left mirror will start illuminating; if the rider is about to change the lane towards the right side, the right edge of the right rear view mirror will start illuminating. This helps the surrounding riders to be more aware of whether the rider is about to change a lane.
[00079] Similarly, blind spot detection helps the rider to prevent accidents due to blind spots. For example, if the rider is taking a turn, the surface below the rear wheel becomes a blind spot, and there is a high possibility of the rider falling if a pothole is encountered. However, with the present subject matter, the blind spot detection unit enables the rider to be aware of the blind spot. Additionally, it can also display that there are blind spots and alert the rider to be more careful through several alerting mechanisms, such as an illuminating lamp or a display on the instrument cluster, so the rider can prevent accidents and injuries caused due to such blind spots. The blind spot detection method also helps the surrounding riders to be aware that they fall into the blind spot zone of the rider, and thus the surrounding riders will also try to maintain sufficient distance to avoid collision with the rider.
[00080] The speed recommendation unit helps the rider to avoid wasting fuel, prevents the rider from violating traffic rules, and helps the rider to predict whether there is a possibility of crossing the traffic light before it turns red again, or prevents him from accelerating the vehicle if there is no possibility of crossing the traffic light before it turns red. For example, if a rider is riding at a speed and the traffic light is green but about to turn red, as shown in the duration (15 sec) status of the traffic sign, in conventional systems the rider used to accelerate the vehicle, and the moment it reaches the traffic signal, it turns red, leading to extreme wastage of fuel. Such fuel wastage can be prevented with the present subject matter: facing the same situation, the VCU, based on the historical riding pattern of the rider and the infrastructural data of the traffic sign status, can determine the suitable speed at which the rider should ride in order to cross the traffic signal, and thus provides speed recommendations in order to prevent fuel wastage.
[00081] Thus, the claimed VCU recommends that the rider accelerate to a speed of 65 km/h from the current speed of 40 km/h so that within the next 10 seconds the rider can cross the junction where the traffic sign is installed. While recommending the acceleration to the rider, the VCU takes into account the safety of the rider as well as the environment map data received from the plurality of sensors.
[00082] Fig. 3 illustrates a flowchart of a method 300 for controlling the vehicle 100 using the vehicle control unit 101, in accordance with an embodiment of the present subject matter.
[00083] The method starts at step 302 and proceeds to step 304. At step 304, the vehicle control unit 101 is configured to receive real-time data from a plurality of sensors. At step 306, the vehicle control unit 101 is configured to generate, in real-time, environmental map data based on the data received from the plurality of sensors. At step 308, the vehicle control unit 101 is configured to compute a travel route for travelling from an origin point to a destination point based on the real-time environmental map data. In an embodiment, the origin point and the destination point are received from a rider of the vehicle.
[00084] At step 310, the vehicle control unit 101 is configured to predict a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model.

[00085] At step 312, the vehicle control unit 101 is configured to identify a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route. In an embodiment, the location is at least one of a LH side, an RH side, a front side and a rear side of the vehicle. At step 314, the vehicle control unit 101 is configured to provide one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots. Control passes to end step 316.
[00086] Referring to Fig. 4, which illustrates a flow chart for lane change detection, a plurality of lidar sensors (1 mounted at the front, 1 mounted near the tail lamp), proximity sensors (1 on the LHS of the handlebar and 1 on the RHS of the handlebar) and image sensors (1 mounted at the front and 1 mounted near the tail lamp) are used in the vehicle (100). At step 405, environment map data is created based on the sensors to identify vehicles in front of and behind the rider in real-time during riding of the vehicle, through camera images, steer encoder or IMU sensor data, and turn signal lamp data. At step 410, navigation data is received from a rider that specifies an origin point and a destination point. At step 420, a route from the origin point to the destination point is determined. At step 430, based on the current GPS location, traffic data and the mapped environment data, a probability of a lane change being performed by the rider at a particular time instant is predicted using a binary prediction model. The vehicle control unit (130) detects whether a camera and steer encoder recognize a lane change and the speed data is also greater than zero (step 430a), or whether a turn signal is applied and the vehicle roll is opposite to the turn signal (step 430b). The vehicle control unit (130) at step 435 generates a lane change detection alert to the user.
[00087] Referring to Fig. 5, which illustrates a flowchart depicting a method for blind spot detection, at step 505, based on the data received from the sensors, including the camera and the face orientation of the rider, and the lane change prediction output, the vehicle control unit (130) is configured to identify one or more blind spots for the rider. At step 510, the vehicle control unit (130) calculates the vision or reach of the rider. At step 515, a location of the blind spot is displayed on a display of the instrument cluster or a smart helmet or on a visor of the vehicle to alert the rider. Finally, at step 520, haptic feedback is provided when a blind spot is detected.
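The Fig. 5 flow (estimate the rider's vision, then flag objects outside it) can be sketched with simple bearing geometry. The 100-degree field of view and the vehicle-frame bearing convention are illustrative assumptions; the specification does not give numeric parameters.

```python
# Minimal sketch of the Fig. 5 idea: estimate the rider's field of view
# from face orientation and flag surrounding objects that fall outside it.
# The 100-degree field of view is an illustrative assumption.
def blind_spots(rider_heading_deg, object_bearings_deg, fov_deg=100):
    """Return bearings (degrees, vehicle frame) outside the rider's view."""
    half = fov_deg / 2.0
    out = []
    for bearing in object_bearings_deg:
        # Smallest angular difference between heading and object bearing.
        diff = abs((bearing - rider_heading_deg + 180) % 360 - 180)
        if diff > half:
            out.append(bearing)
    return out

# Rider faces 0 deg; objects at 10 (ahead), 160 (rear RH), 200 (rear LH) deg.
print(blind_spots(0, [10, 160, 200]))  # [160, 200]
```

The flagged bearings would then drive the display step (515) and the haptic feedback step (520).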

[00088] Referring to Fig. 6, which depicts a flowchart illustrating a method for speed control or a recommendation mechanism for the rider by the vehicle control unit, at step 605, the traffic sign status for a particular traffic sign is analyzed using LIDAR input. At step 610a, the time required to reach the traffic sign is determined based on the historical riding behavior of the rider, or, at step 610b, the distance of the vehicle from the upcoming vehicle is measured; a recommendation is then provided, at step 615, to either accelerate or maintain a constant speed of X km/hr so that, by the time the rider reaches the traffic sign, the traffic sign would be green. This would avoid wait time at the traffic signal and would also avoid unnecessary acceleration by the rider, as the system already knows that even after Y acceleration the rider will not be able to cross the traffic sign. This would help in better fuel efficiency. Thus, at step 620, the rider can control either the brake or the acceleration.
[00089] Various alert notifications are sent to the rider, such as blind spot detection, forward collision alert, rear collision alert, cross-traffic alert, merging-traffic alert, lane departure warning, and traffic sign status (red/orange/green).
[00090] Thus, a single centralized vehicle control unit is provided to perform motor control functions, vehicle body control functions, control vehicle functionalities, and ARAS functions, and to upgrade existing vehicles and provide better advanced functionality. The VCU is capable of performing motor control functions; vehicle body control functions including side stand, fuel tank lid, low beam and high beam lights, hazard lights controlling, mobile charger, and reverse park assist; and ARAS functions including detecting driver drowsiness and controlling the vehicle for safety, sending alerts, automatic TSL light ON/OFF, and automated parking for two-wheeled vehicles. Thus, a single vehicle control unit combines the functionalities of individual ECUs.
[00091] The present subject matter offers an integrated vehicle control unit that is configured to combine the functioning of a plurality of electronic control units and perform a plurality of functions such as motor control functions, vehicle body control functions, mapping environment data, predicting lane change, detecting blind spot, controlling speed, alerting functions, battery monitoring functions, and ARAS functions. This offers a great advantage, as in conventional systems separate control units were present; however, in the present subject matter, a single integrated VCU is capable of performing all the mentioned functions.
[00092] The present subject matter offers the advantage of having real-time data from which real-time environment map data is generated, comprising one or more vehicles surrounding the vehicle and one or more objects within a travel path of the vehicle. The one or more objects comprise trees, sign boards, pedestrians, speed breakers and reflectors. Thus, by claiming generating, in real-time, environmental map data based on the data received from the plurality of sensors, the present subject matter offers the advantage of making the rider more aware and even sensitizing the surrounding riders, which further provides more safety to the public at large.

[00093] In conventional systems, only a lane departure warning is present. However, the present subject matter offers a lane change prediction alert as well. Thus, by claiming predicting a probability of a lane change while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model, the present subject matter offers an advantage by helping the rider to switch to another lane on facing difficulties like potholes or speed breakers in the present lane, which provides utmost safety and comfort to the rider as well as the surrounding vehicles.

[00094] The present subject matter also offers an advantage of identifying a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route. The blind spot detection helps the rider to be more aware and prevent accidents and injuries to himself as well as surrounding riders.

[00095] The present subject matter also offers the advantage of providing one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots. This helps in alerting the rider to be more careful before encountering blind spots or while changing lanes, thus ensuring the safety of riders and surrounding vehicles. Road riding is a very dangerous task; thus, the claimed subject matter provides safety features to the public at large, as every life matters.
[00096] The present subject matter offers a speed recommendation to the rider, where the recommendation can be either to accelerate and maintain the accelerated speed, to decelerate and maintain the decelerated speed, or to make no change and maintain a constant speed while encountering a traffic sign status, which prevents unnecessary over-speeding and wastage of fuel.
[00097] In conventional vehicles, if a customer wants to upgrade the vehicle with new functionalities, it is not possible to upgrade the vehicle to its maximum potential, and the time required for upgrading is high. There are several drawbacks associated with multiple control units, such as greater complexity of design, usage of higher quantities of wiring harness, lack of usable space in the vehicle, and difficulty of upgrading; the addition of three or more electronic control units requires different logic to be embedded and becomes a costly affair, increasing overall assembly time, servicing time, upgrading time and overall overhauling time. Thus, the present subject matter offers the advantage of a single integrated VCU with OTA functionality, which provides easy upgradation over the air, provides more usable space, and reduces part numbers and housing components. By claiming flashing the plurality of functions to the vehicle control unit using OTA functionality, the present subject matter solves the above-mentioned problem.
[00098] In conventional vehicles, multiple electronic control units (ECUs) are utilized to control the functions of each unit. The present subject matter offers the advantage of an integrated single VCU, which ensures reduced wiring length, reduced housing components for separate ECUs and reduced mounting brackets, providing an overall reduced cost.
[00099] While certain features of the claimed subject matter have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes that fall within the true spirit of the claimed subject matter.

CLAIMS

We claim:

1. A method for controlling a vehicle, the method comprising:

receiving, by a vehicle control unit, real-time data from a plurality of sensors;

generating in real-time, by the vehicle control unit, environmental map data based on the data received from the plurality of sensors;

computing, by the vehicle control unit, a travel route for travelling from an origin point to a destination point based on the real-time environmental map data, wherein the origin point and the destination point are received from a rider of the vehicle;

predicting, by the vehicle control unit, a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model;

identifying, by the vehicle control unit, a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route, wherein the location is at least one of a LH side, an RH side, a front side and a rear side of the vehicle; and

providing, by the vehicle control unit, one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots.

2. The method for controlling the vehicle as claimed in claim 1, wherein the one or more alerts comprise at least one of:

indicating, by the vehicle control unit, a location of the one or more blind spots along a plurality of sides of a side view mirror of the vehicle, wherein each of the plurality of sides of the side view mirror is mapped to a corresponding at least one of the LH side, the RH side, the front side and the rear side of the vehicle, and wherein the indication comprises illuminating one or more light sources disposed along the plurality of sides of the side view mirror;

indicating, by the vehicle control unit, one or more regions comprising the one or more blind spots in real-time on an instrument cluster of the vehicle, wherein a pictorial representation of the vehicle is displayed on at least one of the instrument cluster, a visor of a smart helmet, and a heads-up display unit of the vehicle, the one or more regions being highlighted on the instrument cluster;

providing, by the vehicle control unit, haptic feedback to the rider of the vehicle a predefined time before the particular time instant at which the rider will change the lane, in response to identification of the location of the one or more blind spots; and

providing, by the vehicle control unit, at least one of a forward collision alert, a rear collision alert, a merging-traffic alert, and a lane departure warning based on the environmental map data.
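The first alert of claim 2 maps each side of the side-view mirror to a vehicle side and lights the LED on the matching edge. A minimal sketch, assuming a four-edge LED layout; the mapping and names are hypothetical, as the claim only requires that each mirror side correspond to at least one vehicle side.

```python
# Hypothetical mirror-edge mapping; the claim does not fix which mirror side
# corresponds to which vehicle side.
MIRROR_SIDE_MAP = {"LH": "left_edge", "RH": "right_edge",
                   "front": "top_edge", "rear": "bottom_edge"}

def mirror_leds_to_illuminate(blind_spot_sides):
    """Return the mirror-edge LEDs to switch on for the detected blind-spot sides."""
    return sorted(MIRROR_SIDE_MAP[s] for s in blind_spot_sides if s in MIRROR_SIDE_MAP)
```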

3. The method for controlling the vehicle as claimed in claim 1, comprising controlling, by the vehicle control unit, speed of the vehicle based on a traffic sign status, an associated duration of the traffic sign status, and a historical riding pattern of the rider at a plurality of junctions enroute on the travel route, wherein the traffic sign status and the associated duration of the traffic sign status are determined based on at least one of the plurality of sensors or received from a central traffic server configured to provide Infrastructure as a Service (IaaS).

4. The method for controlling the vehicle as claimed in claim 3, wherein controlling comprises:

recommending, by the vehicle control unit, at least one of maintaining a current speed, accelerating to a first speed or decelerating to a second speed, and maintaining at least one of the first speed or the second speed, based on the traffic sign status and the associated duration of the traffic sign status at the plurality of junctions enroute on the travel route, wherein in an autonomous driving mode of the vehicle, the vehicle control unit is configured to perform the recommendations, and wherein the speed of the vehicle is controlled based on the environmental map data.
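One way to ground the recommendation of claims 3 and 4 is a simple kinematic heuristic: compare the speed needed to reach the junction before the current signal phase ends with the current speed. This is only an assumed illustration; the claims do not specify the decision rule, and it ignores the historical riding pattern the claims also use.

```python
# Illustrative heuristic for the maintain/accelerate/decelerate recommendation
# of claim 4; the thresholds and rule are assumptions, not the claimed method.
def recommend_speed(current_speed_mps, distance_to_junction_m,
                    sign_status, remaining_duration_s):
    """Recommend a speed action from the sign status and its remaining duration."""
    if remaining_duration_s <= 0:
        return "maintain"
    # Speed needed to reach the junction exactly when the phase ends.
    required_speed = distance_to_junction_m / remaining_duration_s
    if sign_status == "green":
        if current_speed_mps >= required_speed:
            return "maintain"        # will clear the junction while still green
        return "accelerate"          # speed up (within limits) to catch the green
    # red or yellow: slow down so the junction is reached after the phase changes
    return "decelerate" if current_speed_mps > required_speed else "maintain"
```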

5. The method for controlling the vehicle as claimed in claim 3, wherein the traffic sign status comprises one of: red, yellow, green, wherein each traffic sign status has an associated duration.

6. The method for controlling the vehicle as claimed in claim 3, wherein the historical riding pattern comprises braking timing, acceleration timing, and gear shift pattern.

7. The method for controlling the vehicle as claimed in claim 1, wherein predicting the probability of the lane change comprises:

receiving, by the vehicle control unit, image data from one or more image sensors mounted in a front portion and a rear portion of the vehicle, IMU data, and TSL lamp data;

correlating, by the vehicle control unit, the received data along with data received from the plurality of sensors for predicting the probability of the lane change based on one or more statistical techniques, wherein an output of the correlation is fed as input to the binary prediction model to determine the probability of the lane change.
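Claim 7 feeds correlated camera, IMU and turn-signal-lamp (TSL) features into a binary prediction model. A minimal sketch, assuming logistic regression as the binary model and a plain feature concatenation as the correlation step; the weights, feature names and both choices are illustrative, since the claim only requires "one or more statistical techniques" and a binary model.

```python
# Hypothetical instance of claim 7's pipeline: correlate features, then apply
# a binary (logistic) prediction model. Weights below are illustrative, not trained.
import math

def correlate(front_cam_offset, rear_cam_offset, imu_yaw_rate, tsl_on):
    # Simple feature combination standing in for the correlation step.
    return [front_cam_offset, rear_cam_offset, imu_yaw_rate, 1.0 if tsl_on else 0.0]

def lane_change_probability(features, weights, bias=0.0):
    """Binary prediction model: logistic regression over the correlated features."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```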

8. The method for controlling the vehicle as claimed in claim 1, wherein predicting the probability of the lane change is based on a set of pre-defined conditions, the set of pre-defined conditions being at least one of:

a first condition indicative of camera and steering angle sensor data determining a lane change and speed of the vehicle being greater than zero; and

a second condition indicative of a TSL associated with a first direction being ON and vehicle movement being in an opposite direction.
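The two pre-defined conditions of claim 8 can be expressed directly as a boolean check. The sketch below is an assumed encoding; the parameter names and the OR combination ("at least one of") are illustrative.

```python
# Illustrative check of claim 8's two pre-defined conditions; field names
# are hypothetical.
def lane_change_condition_met(camera_detects_lane_change,
                              steering_detects_lane_change,
                              speed_mps, tsl_direction, movement_direction):
    # Condition 1: camera and steering-angle data both indicate a lane change
    # while the vehicle is moving (speed greater than zero).
    cond1 = (camera_detects_lane_change and steering_detects_lane_change
             and speed_mps > 0)
    # Condition 2: a turn-signal lamp (TSL) is ON for one direction while the
    # vehicle is actually moving in the opposite direction.
    cond2 = (tsl_direction is not None and movement_direction is not None
             and tsl_direction != movement_direction)
    return cond1 or cond2
```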

9. The method for controlling the vehicle as claimed in claim 1, wherein the environmental map data comprises one or more vehicles surrounding the vehicle and one or more objects within a travel path of the vehicle, wherein the one or more objects comprise trees, sign boards, pedestrians, speed breakers, and reflectors.
10. The method for controlling the vehicle as claimed in claim 1, wherein the plurality of sensors comprises a first lidar sensor mounted in a front portion of the vehicle, a second lidar sensor mounted in a rear portion of the vehicle, a first image sensor mounted in the front portion of the vehicle, a second image sensor mounted in the rear portion of the vehicle, a first proximity sensor mounted on an LH side of a handle bar assembly of the vehicle, a second proximity sensor mounted on an RH side of the handle bar assembly of the vehicle, a GPS sensor, an IMU sensor, and a speed sensor.

11. The method for controlling the vehicle as claimed in claim 1, wherein the real-time data comprises a distance between the vehicle and surrounding vehicles, traffic sign status data, road lane markings, a current location of the vehicle, and a current speed of the vehicle.

12. The method for controlling the vehicle as claimed in claim 1, wherein the vehicle control unit is mounted in a cavity, the cavity being defined between a pair of upper tubes (110a), a pair of central tubes (112a) and a rear portion of down tubes (108a) of the vehicle, and wherein the vehicle control unit is mounted close to a centre of gravity (GG’) of the vehicle (100).

13. The method for controlling the vehicle as claimed in claim 1, wherein the vehicle control unit (130) is disposed in a front portion of the vehicle (100), being disposed below a handlebar assembly and in front of the head tube.

14. The method for controlling the vehicle as claimed in claim 1, wherein the vehicle control unit is an integrated vehicle control unit configured to combine functioning of a plurality of electronic control units and perform a plurality of functions, the plurality of functions comprising motor control functions, vehicle body control functions, mapping environment data, predicting lane change, detecting blind spot, controlling speed, alerting functions, battery monitoring functions, and ARAS functions.

15. The method for controlling the vehicle as claimed in claim 14, wherein the vehicle body control functions include controlling operation of a side stand, opening of a fuel tank lid, or opening of a lid that covers a power source used for driving the vehicle, wherein the vehicle functionalities include controlling intensity, operating frequency and wavelength associated with the one or more light sources and providing a current to charge one or more external devices, and wherein the ARAS functions include detecting driving drowsiness, controlling the vehicle, sending alerts, automatic controlling of the TSL, and automated parking.

16. The method for controlling the vehicle as claimed in claim 1, comprising remotely flashing the plurality of functions to the vehicle control unit using OTA functionality.

17. A vehicle control unit for a vehicle, the vehicle control unit comprising:

a processor; and

a computer-readable medium communicatively coupled to the processor, wherein the computer-readable medium stores processor-executable instructions, which when executed by the processor, cause the processor to:

receive real-time data from a plurality of sensors;

generate in real-time environmental map data based on the data received from the plurality of sensors;

compute a travel route for travelling from an origin point to a destination point based on the real-time environmental map data, wherein the origin point and the destination point are received from a rider of the vehicle;

predict a probability of a lane change being performed by the rider, while travelling from the origin point to the destination point on the computed travel route, at a particular time instant, based on a combination of a current GPS location of the vehicle, traffic data and the mapped environment data using a binary prediction model;

identify a location of one or more blind spots for the rider based on the data from the plurality of sensors while the rider is travelling on the computed travel route, wherein the location is at least one of an LH side, an RH side, a front side and a rear side of the vehicle; and

provide one or more alerts to the rider while travelling on the computed travel route based on the predicted probability of the lane change and the identified location of the one or more blind spots.

18. The vehicle control unit for the vehicle as claimed in claim 17, wherein the one or more alerts comprise at least one of:

indicating, by the vehicle control unit, a location of the one or more blind spots along a plurality of sides of a side view mirror of the vehicle, wherein each of the plurality of sides of the side view mirror is mapped to a corresponding at least one of the LH side, the RH side, the front side and the rear side of the vehicle, and wherein the indication comprises illuminating one or more light sources disposed along the plurality of sides of the side view mirror;

indicating, by the vehicle control unit, one or more regions comprising the one or more blind spots in real-time on an instrument cluster of the vehicle, wherein a pictorial representation of the vehicle is displayed on at least one of the instrument cluster, a visor of a smart helmet, and a heads-up display unit of the vehicle, the one or more regions being highlighted on the instrument cluster;

providing, by the vehicle control unit, haptic feedback to the rider of the vehicle a predefined time before the particular time instant at which the rider will change the lane, in response to identification of the location of the one or more blind spots; and

providing, by the vehicle control unit, at least one of a forward collision alert, a rear collision alert, a merging-traffic alert, and a lane departure warning based on the environmental map data.

19. The vehicle control unit for the vehicle as claimed in claim 17, wherein the processor is configured to control speed of the vehicle based on a traffic sign status, an associated duration of the traffic sign status, and a historical riding pattern of the rider at a plurality of junctions enroute on the travel route, wherein the traffic sign status and the associated duration of the traffic sign status are determined based on at least one of the plurality of sensors or received from a central traffic server configured to provide Infrastructure as a Service (IaaS), wherein the traffic sign status comprises one of: red, yellow, green, wherein each traffic sign status has an associated duration, and wherein the historical riding pattern comprises braking timing, acceleration timing, and gear shift pattern.

20. The vehicle control unit for the vehicle as claimed in claim 19, wherein controlling comprises:

recommending, by the vehicle control unit, at least one of maintaining a current speed, accelerating to a first speed or decelerating to a second speed, and maintaining at least one of the first speed or the second speed, based on the traffic sign status and the associated duration of the traffic sign status at the plurality of junctions enroute on the travel route, wherein in an autonomous driving mode of the vehicle, the vehicle control unit is configured to perform the recommendations, and wherein the speed of the vehicle is controlled based on the environmental map data.

21. The vehicle control unit for the vehicle as claimed in claim 17, wherein, to predict the probability of the lane change, the processor is configured to:

receive image data from one or more image sensors mounted in a front portion and a rear portion of the vehicle, IMU data, and TSL lamp data; and

correlate the received data along with data received from the plurality of sensors for predicting the probability of the lane change based on one or more statistical techniques, wherein an output of the correlation is fed as input to the binary prediction model to determine the probability of the lane change.

22. The vehicle control unit for the vehicle as claimed in claim 17, wherein the processor is configured to predict the probability of the lane change based on a set of pre-defined conditions, the set of pre-defined conditions being at least one of:

a first condition indicative of camera and steering angle sensor data determining a lane change and speed of the vehicle being greater than zero; and

a second condition indicative of a TSL associated with a first direction being ON and vehicle movement being in an opposite direction.

23. The vehicle control unit for the vehicle as claimed in claim 17, wherein the environmental map data comprises one or more vehicles surrounding the vehicle and one or more objects within a travel path of the vehicle, wherein the one or more objects comprise trees, sign boards, pedestrians, speed breakers, and reflectors, wherein the plurality of sensors comprises a first lidar sensor mounted in a front portion of the vehicle, a second lidar sensor mounted in a rear portion of the vehicle, a first image sensor mounted in the front portion of the vehicle, a second image sensor mounted in the rear portion of the vehicle, a first proximity sensor mounted on an LH side of a handle bar assembly of the vehicle, a second proximity sensor mounted on an RH side of the handle bar assembly of the vehicle, a GPS sensor, an IMU sensor, and a speed sensor, and wherein the real-time data comprises a distance between the vehicle and surrounding vehicles, traffic sign status data, road lane markings, a current location of the vehicle, and a current speed of the vehicle.

24. The vehicle control unit for the vehicle as claimed in claim 17, wherein the vehicle control unit is mounted in a cavity, the cavity being defined between a pair of upper tubes (110a), a pair of central tubes (112a) and a rear portion of down tubes (108a) of the vehicle, and wherein the vehicle control unit is mounted close to a centre of gravity (GG’) of the vehicle (100).

25. The vehicle control unit for the vehicle as claimed in claim 17, wherein the vehicle control unit (130) is disposed in a front portion of the vehicle (100), being disposed below a handlebar assembly and in front of the head tube.

26. The vehicle control unit for the vehicle as claimed in claim 17, wherein the vehicle control unit is an integrated vehicle control unit configured to combine functioning of a plurality of electronic control units and perform a plurality of functions, the plurality of functions comprising motor control functions, vehicle body control functions, mapping environment data, predicting lane change, detecting blind spot, controlling speed, alerting functions, battery monitoring functions, and ARAS functions.

27. The vehicle control unit for the vehicle as claimed in claim 17, wherein the vehicle body control functions include controlling operation of a side stand, opening of a fuel tank lid, or opening of a lid that covers a power source used for driving the vehicle, wherein the vehicle functionalities include controlling intensity, operating frequency and wavelength associated with the one or more light sources and providing a current to charge one or more external devices, and wherein the ARAS functions include detecting driving drowsiness, controlling the vehicle, sending alerts, automatic controlling of the TSL, and automated parking.

28. The vehicle control unit for the vehicle as claimed in claim 17, wherein the processor is configured to remotely flash the plurality of functions to the vehicle control unit using OTA functionality.

Documents

Application Documents

# Name Date
1 202241019814-PROVISIONAL SPECIFICATION [31-03-2022(online)].pdf 2022-03-31
2 202241019814-FORM 1 [31-03-2022(online)].pdf 2022-03-31
3 202241019814-DRAWINGS [31-03-2022(online)].pdf 2022-03-31
4 202241019814-DRAWING [31-03-2023(online)].pdf 2023-03-31
5 202241019814-CORRESPONDENCE-OTHERS [31-03-2023(online)].pdf 2023-03-31
6 202241019814-COMPLETE SPECIFICATION [31-03-2023(online)].pdf 2023-03-31
7 202241019814-FORM 18 [10-11-2023(online)].pdf 2023-11-10