
A System For Controlling A Vehicle And A Method Thereof

Abstract: The present invention relates to a system (100) and method (200) for controlling a vehicle. The system comprises a first control unit (134) provided in the vehicle (130) and a second control unit (126) provided in a portable electronic device (120) and configured to be communicatively coupled to the first control unit (134). The second control unit (126) is configured to receive a signal indicative of one or more datasets from a plurality of sensors (124), receive information associated with one or more states of the vehicle (130) from the first control unit (134), determine one or more actions on the vehicle (130) based on the one or more datasets and the information associated with one or more states of the vehicle (130), and control the first control unit (134) of the vehicle (130) based on the one or more actions. Reference Figure 1


Patent Information

Application #
Filing Date
20 March 2024
Publication Number
39/2025
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
Parent Application

Applicants

TVS MOTOR COMPANY LIMITED
“Chaitanya” No.12 Khader Nawaz Khan Road, Nungambakkam Chennai-600 006, Tamil Nadu India

Inventors

1. BALAGANESH SELVARAJAN
TVS Motor Company Limited “Chaitanya” No 12 Khader Nawaz Khan Road, Nungambakkam Chennai-600 006, Tamil Nadu India

Specification

Description:
FIELD OF THE INVENTION
[001] The present invention relates to a vehicle. More particularly, the present invention relates to a system for controlling a vehicle.

BACKGROUND OF THE INVENTION
[002] In recent years, there has been rapid advancement in the integration of smart technologies within vehicles to enhance their functionality, safety, and overall user experience. These advancements include Advanced Driver Assistance Systems (ADAS), Autonomous Emergency Braking (AEB), and Lane Keeping Assistance (LKA). Presently, smartphones are used as a medium for controlling vehicle steering actions, where users manually provide commands for remote vehicle operation. Typically, these smartphones rely on data from vehicle sensors for information to control vehicle operations.
[003] While this approach offers some level of control, existing systems require constant user intervention, limiting their practicality and safety. Further, existing smartphone-based vehicle control solutions rely on manual commands from users, which can be cumbersome and pose safety risks, especially in dynamic driving situations. Furthermore, the integration of resource-heavy artificial intelligence (AI) applications in conventional vehicle control units results in increased costs and complexity. The limitations of current vehicle controllers, which are unable to accommodate heavy AI applications due to limited computing power, further compound these challenges. Additionally, limited space within vehicles restricts the integration of bulky computing components necessary for implementing advanced safety features. Moreover, the increasing number of electronic controllers in vehicles escalates resource demands and costs, affecting the seamless incorporation of new functionalities in the vehicle. Therefore, existing systems face limitations such as the need for user intervention, high cost, and space constraints for accommodating heavy computing components.
[004] In view of the foregoing, it is desirable to overcome at least the above-mentioned disadvantages of the prior art.

SUMMARY OF THE INVENTION
[005] In one aspect of the invention, a system for controlling a vehicle is disclosed. The system has a first control unit provided in the vehicle and a second control unit provided in a portable electronic device. The second control unit is configured to be communicatively coupled to the first control unit. The second control unit is configured to receive a signal indicative of one or more datasets from a plurality of sensors. The second control unit is further configured to receive information associated with one or more states of the vehicle from the first control unit. The second control unit is configured to determine, based on the one or more datasets and the information associated with one or more states of the vehicle, one or more actions on the vehicle. The second control unit is further configured to control, based on the one or more actions, the first control unit of the vehicle.
[006] In an embodiment, the second control unit is configured to process one or more datasets and the information associated with one or more states of the vehicle.
[007] In an embodiment, the second control unit is configured to determine, based on the one or more datasets and the information associated with one or more states of the vehicle, a first inference, the first inference being information corresponding to the one or more states of the vehicle.
[008] In an embodiment, the second control unit is configured to: calculate, a confidence level of the first inference; and transmit, based on the confidence level, the one or more actions to the first control unit of the vehicle.
[009] In an embodiment, the plurality of sensors is integrated with the portable electronic device.
[010] In an embodiment, the plurality of sensors is communicably coupled to the second control unit.
[011] In an embodiment, the plurality of sensors comprises at least one of a camera, an accelerometer, a magnetometer and a gyro-meter.
[012] In an embodiment, the second control unit is configured to receive one or more datasets from the plurality of sensors at pre-defined time intervals.
[013] In an embodiment, the plurality of sensors is configured to generate one or more datasets based on detection of an event change.
[014] In an embodiment, the portable electronic device is at least one of a personal digital assistant device, a computing device, a mobile device, a smart headset device and a smart watch.
[015] In an embodiment, the one or more actions include at least one of: an audio alert to the first control unit of the vehicle, a visual alert to the first control unit of the vehicle, and a communication including at least one of voice, call and text transmitted to the portable electronic device.
[016] In another aspect of the present invention, a method for controlling a vehicle is disclosed. The method has a step of receiving a signal indicative of one or more datasets from a plurality of sensors. The step of receiving is performed by a second control unit. The second control unit is provided in a portable electronic device and configured to be communicatively coupled to a first control unit. The first control unit is provided in the vehicle. The method further has a step of receiving information associated with one or more states of the vehicle from the first control unit. The step of receiving the information associated with one or more states of the vehicle is performed by the second control unit. The method further has a step of determining, based on the one or more datasets and the information associated with one or more states of the vehicle, one or more actions on the vehicle. The step of determining is performed by the second control unit. The method further has a step of controlling, based on the one or more actions, the first control unit of the vehicle. The step of controlling is performed by the second control unit.
[017] In an embodiment, the method has the step of processing, by the second control unit, one or more datasets and the information associated with one or more states of the vehicle.
[018] In an embodiment, the method has the step of determining, by the second control unit, based on the one or more datasets and the information associated with one or more states of the vehicle, a first inference, the first inference being information corresponding to the one or more states of the vehicle.
[019] In an embodiment, the method has the step of calculating, by the second control unit, a confidence level of the first inference; and transmitting, based on the confidence level, the one or more actions to the first control unit of the vehicle.
[020] In an embodiment, the plurality of sensors is integrated with the portable electronic device.
[021] In an embodiment, the plurality of sensors is communicably coupled to the second control unit.
[022] In an embodiment, the plurality of sensors comprises at least one of a camera, an accelerometer, a magnetometer and a gyro-meter.
[023] In an embodiment, the method has the step of receiving, by the second control unit, one or more datasets from the plurality of sensors at pre-defined time intervals.
[024] In an embodiment, the method has the step of generating, by the plurality of sensors, one or more datasets based on detection of an event change.
[025] In an embodiment, the portable electronic device is at least one of a personal digital assistant device, a computing device, a mobile device, a smart headset device and a smart watch.
[026] In an embodiment, the one or more actions include at least one of: an audio alert to the first control unit of the vehicle, a visual alert to the first control unit of the vehicle, and a communication including at least one of voice, call and text transmitted to the portable electronic device.

BRIEF DESCRIPTION OF THE DRAWINGS
[027] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 illustrates a block diagram of a system for controlling a vehicle, in accordance with an embodiment of the present invention.
Figure 2 illustrates a flow diagram of a method for controlling a vehicle, in accordance with an embodiment of the present invention.
Figure 3 illustrates a flow diagram of the method for controlling a vehicle, in accordance with an embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION
[028] Various features and embodiments of the present invention will be discernible from the following further description thereof, set out hereunder.
[029] For the purpose of the present invention, the term “vehicle” includes bicycles, scooters, trikes, motorcycles, rickshaws, lorries, cars, trucks, buses etc. The term “vehicle” also includes electric vehicles, hybrid vehicles and conventional internal combustion engine vehicles.
[030] Figure 1 illustrates a block diagram of a system for controlling a vehicle, in accordance with an embodiment of the present invention. As shown in Figure 1, the system 100 comprises a portable electronic device 120 that comprises a wireless transceiver 122. The wireless transceiver 122 of the portable electronic device 120 is configured to facilitate wireless communication between the portable electronic device 120 and the vehicle 130. In a non-limiting example, the wireless communication may be one of Bluetooth, Wi-Fi, ZigBee, ANT, Ultra-wideband (UWB), Near-field communication (NFC), Radio Frequency (RF) and Low Frequency (LF). A first control unit 134 is provided in the vehicle 130. A second control unit 126 is provided in the portable electronic device 120. The second control unit 126 is configured to be communicatively coupled to the first control unit 134. In an embodiment, the wireless transceiver 122 facilitates a communication between the second control unit 126 of the portable electronic device 120 and the first control unit 134 of the vehicle 130 via radio frequency (RF) signals. The first control unit 134 is also referred to as a vehicle control unit. In an embodiment, the second control unit 126 communicates wirelessly with the first control unit 134. In a non-limiting example, the portable electronic device 120 can be a personal digital assistant (PDA) device configured to be communicatively coupled with the first control unit 134 of the vehicle 130. In another non-limiting example, the portable electronic device 120 can be one of a personal digital assistant device, a computing device, a mobile device, a smartphone, a smartwatch, a smart glove, a smart headset device and a smart ring of a user of vehicle 130, such as an owner or a rider, configured to be communicatively coupled with the first control unit 134. 
These examples should not be construed as limiting, and other now known or later developed portable electronic devices are well within the scope of the present invention.
[031] The portable electronic device 120 further comprises a plurality of sensors 124. The plurality of sensors 124 generates a signal indicative of one or more datasets. In an embodiment, the plurality of sensors 124 is integrated with the portable electronic device 120. In an embodiment, the plurality of sensors 124 comprises at least one of a camera, an accelerometer, a magnetometer and a gyro-meter. In an example, the camera captures video footage or photographs of events occurring on the road while the vehicle 130 is in motion or parked. These cameras are equipped with wide-angle lenses that capture a broad field of view, enabling them to record most of the road ahead and the areas adjacent to the vehicle 130. In an example, the accelerometer is a device that measures the acceleration or sudden change of motion of the vehicle 130. It is mainly used in safety applications, such as airbags, anti-lock braking systems, and traction control systems. It can also be used to detect vibrations, location, and inclination of the vehicle 130. In an example, the magnetometer determines the orientation of the vehicle 130 for providing accurate compass readings within the vehicle's navigation system. In an example, the gyro-meter measures the vehicle's angular rate or rotation around its axis. By detecting sudden changes in yaw (sideways motion) or roll (tilting), the gyro-meter helps in maintaining the stability of the vehicle 130 and adjusting damping rates to improve ride comfort and handling.
The plurality of sensors 124 are positioned at a first location associated with a left handlebar of the vehicle 130 (not shown), a second location associated with a right handlebar of the vehicle 130 (not shown), a third location associated with a right downward side of the vehicle 130 (not shown), a fourth location associated with a left downward side of the vehicle 130 (not shown), a fifth location associated with a fuel tank of the vehicle 130 (not shown) and a sixth location associated with a tail lamp of the vehicle 130 (not shown). In another embodiment, the plurality of sensors 124 may be auxiliary devices communicably coupled to the second control unit 126. In an example, the auxiliary devices are home IoT devices, a smart helmet or other smart peripheral devices carried by the user.
[032] The plurality of sensors 124 generate a signal indicative of one or more datasets. The plurality of sensors 124 is configured to generate one or more datasets based on detection of an event change in or around the vehicle 130. In an example, the plurality of sensors 124 coupled to the portable electronic device 120 collects the raw information directly from the vehicle 130. In an embodiment, when the portable electronic device 120 is mounted on a right side of a handlebar of the vehicle 130, the plurality of sensors 124 integrated with the portable electronic device 120 will capture information on the right side of the vehicle 130. In an example, the cameras in the portable electronic device 120 mounted on the right side of the handlebar will capture traffic information on the right-hand side of the vehicle 130. In another example, the portable electronic device 120 mounted on the fuel tank of the vehicle 130 captures a vibration profile of the vehicle 130. In yet another example, the portable electronic device 120 mounted on a tail lamp assembly of the vehicle 130 captures the rearview of the vehicle 130.
[033] As shown in Figure 1, the portable electronic device 120 comprises the second control unit 126. The second control unit 126 is configured to receive the signal indicative of one or more datasets from the plurality of sensors 124. In an embodiment, the second control unit 126 is configured to receive one or more datasets from the plurality of sensors 124 at pre-defined time intervals. In an embodiment, the pre-defined time interval is between 1 and 10 seconds. The second control unit 126 of the portable electronic device 120 hosts an application to read and process the information received from the plurality of sensors 124 and the first control unit 134 of the vehicle 130. In an embodiment, the second control unit 126 of the portable electronic device 120 hosts an application in addition to a neural network model to process information received from the plurality of sensors 124 and the first control unit 134 of the vehicle 130.
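The interval-based read-out described above can be sketched as follows. This is an illustrative sketch only, not part of the specification: the sensor names and the `read_sensors` helper are hypothetical, and the 5-second interval is merely one value inside the disclosed 1-to-10-second range.

```python
import time

POLL_INTERVAL_S = 5  # assumed value within the disclosed 1-10 second range

def read_sensors(sensors):
    """Collect one dataset from each sensor (hypothetical helper).

    `sensors` maps a sensor name to a zero-argument callable returning
    that sensor's current reading.
    """
    return {name: sensor() for name, sensor in sensors.items()}

def poll(sensors, cycles, interval_s=POLL_INTERVAL_S):
    """Read the plurality of sensors at pre-defined time intervals."""
    datasets = []
    for _ in range(cycles):
        datasets.append(read_sensors(sensors))
        time.sleep(interval_s)
    return datasets
```

In a real deployment the reads would typically be driven by the device's sensor framework (callbacks or event streams) rather than a blocking loop.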
[034] The second control unit 126 is further configured to receive information associated with one or more states of the vehicle 130 from the first control unit 134 of the vehicle 130. The second control unit 126 processes the one or more datasets and the information associated with one or more states of the vehicle 130. The second control unit 126 determines the one or more actions on the vehicle 130 based on the one or more datasets and the information associated with one or more states of the vehicle 130. The second control unit 126 is configured to determine a first inference based on the one or more datasets and the information associated with one or more states of the vehicle 130. The first inference is information corresponding to the one or more states of the vehicle 130. In an example, the information processed by the second control unit 126 is fed to an inference and action engine of the second control unit 126 to filter the most relevant information. The second control unit 126 is configured to calculate a confidence level of the first inference and transmit the one or more actions to the first control unit 134 of the vehicle 130 based on the confidence level. Then, the second control unit 126 controls the first control unit 134 of the vehicle 130 based on the one or more actions. The vehicle 130 further comprises a vehicle mobility controller 136. The function of the vehicle mobility controller 136 is to manage and regulate various aspects of the vehicle's mobility, including its movement, speed, direction, and stability. The vehicle mobility controller 136 acts as a central component within the vehicle's control system, coordinating inputs from various sensors and user commands to ensure safe and efficient operation of the vehicle 130. In an embodiment, the second control unit 126 controls the first control unit 134 of the vehicle 130 based on the one or more actions. The first control unit 134 of the vehicle 130 subsequently controls the vehicle mobility controller 136. In an embodiment, the one or more actions include at least one of: an audio alert to the first control unit 134 of the vehicle 130, a visual alert to the first control unit 134 of the vehicle 130, and a communication including at least one of voice, call and text transmitted to the portable electronic device 120.
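The processing chain of paragraph [034] (datasets and vehicle states in, a first inference with a confidence level out, and actions transmitted to the first control unit only when the confidence level is sufficient) can be sketched as below. The threshold value, the inference rule, and the action labels are all hypothetical stand-ins for the hosted neural network model and are not part of the specification.

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.8  # assumed; the specification only requires a pre-defined threshold

@dataclass
class Inference:
    label: str        # information corresponding to the one or more vehicle states
    confidence: float # confidence level of the first inference

# hypothetical mapping from inference labels to the actions of paragraph [034]
ACTION_MAP = {"obstacle_ahead": "audio_alert", "lane_departure": "visual_alert"}

def infer(datasets, vehicle_states):
    """Stand-in for the hosted application / neural network model."""
    close = datasets.get("obstacle_distance_m", 100.0) < 5.0
    moving = vehicle_states.get("speed_kmph", 0) > 0
    if close and moving:
        return Inference("obstacle_ahead", 0.92)
    return Inference("nominal", 0.3)

def determine_and_transmit(datasets, vehicle_states, transmit):
    """Determine actions; transmit to the first control unit only above threshold."""
    inference = infer(datasets, vehicle_states)
    if inference.confidence >= CONFIDENCE_THRESHOLD and inference.label in ACTION_MAP:
        action = ACTION_MAP[inference.label]
        transmit(action)  # e.g. over the wireless transceiver link
        return action
    return None
```

The `transmit` callable models the wireless link between the second and first control units; low-confidence inferences are simply dropped rather than acted upon.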
[035] As shown in Figure 1, the system 100 further comprises the first control unit 134 in the vehicle 130. The vehicle 130 also includes a wireless transceiver 132. The wireless transceiver 132 facilitates a wireless communication between the first control unit 134 of the vehicle 130 and the second control unit 126 of the portable electronic device 120. The first control unit 134 of the vehicle 130 generates information associated with one or more states of the vehicle 130. The one or more states of the vehicle 130 include information related to speed, orientation, yaw angle, pitch and other related information of the vehicle 130.
[036] In an example, a smartphone (PDA) is mounted near the headlamp of the vehicle 130. The vision sensor on the smartphone is used to gauge the environment around the vehicle 130. For example, obstacles around the vehicle 130, illumination around the vehicle 130 and road information.
When the rider is riding at 70 kmph (high speed), the illumination around the vehicle 130 is dim and it can be difficult for the rider to see around. Since the rider is riding at a very high speed, the rider is unable to change the headlight throw. The vision sensor of the smartphone and the speed information from the vehicle 130 are used to understand the scenario using neural network models in the second control unit 126 of the PDA and to arrive at an inference that the headlight throw needs to adapt as per the inputs from the plurality of sensors 124 of the PDA. Here, the signal indicative of one or more datasets from the plurality of sensors 124 corresponds to the input from the vision sensor of the smartphone. The one or more states of the vehicle from the first control unit 134 correspond to the speed information of the vehicle 130. Thereafter, the neural network assigns confidence levels to its inferences. Only those adjustments or actions with confidence levels above a pre-defined threshold are used to control the motorcycle's headlight system for implementation in the vehicle. That is, only an action with a confidence level above the pre-defined threshold is used to control the first control unit 134 for implementation in the vehicle 130.
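The headlight scenario above reduces to a confidence-gated decision, which might be sketched as follows; the lux and speed cut-offs and the 0.85 threshold are invented for illustration and do not appear in the specification.

```python
THRESHOLD = 0.85  # assumed pre-defined confidence threshold

def headlight_decision(ambient_lux, speed_kmph):
    """Toy stand-in for the neural-network inference.

    Returns (needs_adaptation, confidence). A trained model would
    produce both from the vision-sensor frames and the vehicle speed.
    """
    needs_adaptation = ambient_lux < 20 and speed_kmph >= 60
    confidence = 0.95 if needs_adaptation else 0.40
    return needs_adaptation, confidence

def control_headlight(ambient_lux, speed_kmph, send_to_first_control_unit):
    """Adapt the headlight throw only when confidence clears the threshold."""
    adapt, confidence = headlight_decision(ambient_lux, speed_kmph)
    if adapt and confidence >= THRESHOLD:
        send_to_first_control_unit("adapt_headlight_throw")
        return True
    return False
```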
[037] Therefore, the present invention incorporates the sensors present in the portable electronic device, leading to a reduced part count, and provides enhanced vehicle control without manual intervention. The present invention provides safety solutions using the user's peripheral devices, thus increasing the vehicle's computing power by extending the personal digital assistant's sensors and computing resources. This approach eliminates the need for integrating resource-heavy artificial intelligence (AI) applications into conventional vehicle control units, thereby ensuring enhanced safety without compromising system efficiency.
[038] Figure 2 illustrates a flow diagram of a method for controlling a vehicle, in accordance with an embodiment of the present invention.
[039] As shown, the method 200 comprises a step 202 of receiving a signal indicative of one or more datasets from a plurality of sensors 124. In an embodiment, the plurality of sensors 124 is integrated with a portable electronic device 120. In another embodiment, the plurality of sensors 124 of an auxiliary device can be communicably coupled to the second control unit 126. In an example, the plurality of sensors 124 comprises at least one of a camera, an accelerometer, a magnetometer and a gyro-meter. The step 202 of receiving is performed by a second control unit 126. The second control unit 126 is provided in the portable electronic device 120. The second control unit 126 is configured to be communicatively coupled to a first control unit 134. The first control unit 134 is provided in the vehicle 130. In a non-limiting example, the portable electronic device 120 can be a personal digital assistant device configured to be communicatively coupled with the first control unit 134. In an embodiment, the second control unit 126 receives the one or more datasets from the plurality of sensors 124 at pre-defined time intervals. The one or more datasets from the plurality of sensors 124 include incoming traffic information, rear view information and other related information in and around the vehicle 130. In an example, the plurality of sensors 124 includes a forward-facing camera that captures images of the road ahead to detect lane markings, traffic signs, pedestrians, and obstacles. In another example, the plurality of sensors 124 includes a rearview camera that provides a view of the area behind the vehicle 130 to assist with parking and reversing manoeuvres.
[040] The method 200 further comprises step 204 of receiving information associated with one or more states of the vehicle 130 from the first control unit 134. The step 204 of receiving is performed by the second control unit 126. The one or more states of the vehicle 130 include information related to speed, orientation, yaw angle, pitch and other related information of the vehicle 130.
[041] The method 200 further comprises step 206 of determining one or more actions on the vehicle 130 based on the one or more datasets and the information associated with one or more states of the vehicle 130. The step 206 of determining is performed by the second control unit 126.
[042] The method further comprises step 208 of controlling the first control unit 134 of the vehicle 130 based on the one or more actions. The step 208 of controlling is performed by the second control unit 126. In an embodiment, the one or more actions include at least one of: an audio alert to the first control unit 134 of the vehicle 130, a visual alert to the first control unit 134 of the vehicle 130, and a communication including at least one of voice, call and text transmitted to the portable electronic device 120.
[043] In an example, the system 100 refines the data from the plurality of sensors 124 by applying post-processing techniques. This involves removing false positives or filtering based on specific criteria. For example, for object detection, size and shape are used. A confidence level or score is given to the processed data to indicate its reliability according to the feature it pertains to. This helps in decision-making processes and ensures accurate inferences. Conventional techniques are used to generate information from the plurality of sensors 124. Machine learning-based approaches, such as deep learning, excel at object detection tasks and can be utilized to enhance the accuracy and robustness of the system 100.
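The post-processing described here (rejecting false positives by size and shape, then by confidence score) could look like the following sketch. The specific limits (`min_area`, `max_aspect`, `min_confidence`) are hypothetical values chosen for illustration.

```python
def filter_detections(detections, min_area=400, max_aspect=3.0, min_confidence=0.5):
    """Remove false positives using size and shape criteria, then confidence.

    Each detection is a dict with "width", "height" and "confidence" keys
    (a simplified stand-in for a real detector's output format).
    """
    kept = []
    for det in detections:
        w, h = det["width"], det["height"]
        area = w * h
        aspect = max(w, h) / max(min(w, h), 1)  # guard against zero division
        if area >= min_area and aspect <= max_aspect and det["confidence"] >= min_confidence:
            kept.append(det)
    return kept
```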
[044] In another example, if the smartphone is placed above the headlamp, the camera of the smartphone captures the information of the user. If the user is wearing a helmet, then the smartphone acts as a keyless mechanism to enable unlocking of the vehicle 130. In this scenario, object detection will require post-processing techniques and data validation techniques. For object detection, a confidence level or a score for the detected helmet is calculated to indicate the reliability of the detection. This helps in decision-making processes for the system 100 and ensures an accurate determination of whether the user is wearing a helmet.
[045] Figure 3 illustrates a flow diagram of a method for controlling a vehicle, in accordance with an embodiment of the present invention. As shown, the method 300 comprises a step 302 of monitoring the plurality of sensors 124 from the portable electronic device 120 for significant changes or anomalies. The plurality of sensors 124 comprises at least one of a camera, an accelerometer, a magnetometer and a gyro-meter. The portable electronic device 120 is at least one of a personal digital assistant device, a computing device, a mobile device, a smart headset device and a smart watch.
[046] The wireless communication channel between the portable electronic device 120 and the vehicle 130 may be established via Bluetooth, Wi-Fi, infrared, cellular network, and other forms of wireless communication media. At step 304, the second control unit 126 reads one or more sensor inputs from the portable electronic device 120 at regular intervals if a change in event is detected. In an embodiment, the plurality of sensors 124 generates a signal indicative of one or more datasets. In an example, the plurality of sensors 124 is integrated with the portable electronic device 120. In another example, the plurality of sensors 124 is communicably coupled to the second control unit 126 of the portable electronic device 120. In an example, the plurality of sensors 124 includes a forward-facing camera that captures images of the road ahead to detect lane markings, traffic signs, pedestrians, and obstacles. In another example, the plurality of sensors 124 includes a rearview camera that provides a view of the area behind the vehicle 130 to assist with parking and reversing manoeuvres. At step 306, the second control unit 126 receives one or more vehicle states from the first control unit 134. The one or more states of the vehicle 130 include information related to speed, orientation, yaw angle, pitch and other related information of the vehicle 130.
[047] At step 308, the signal indicative of one or more datasets from the plurality of sensors 124 and the one or more vehicle states from the first control unit 134 of the vehicle 130 are fed to the second control unit 126 of the portable electronic device 120. At step 310, the second control unit 126 derives the inference. At step 312, the second control unit 126 reads the inference at regular intervals. The second control unit 126 computes a confidence level of the inference. At step 314, the second control unit 126 monitors the confidence levels of the inferences. Upon receiving the confidence levels of the inferences, at step 316, the second control unit 126 determines the vehicular actions based on the calculated inferences and vehicle states. Only a vehicular action with a confidence level above a pre-defined threshold is used to control the first control unit 134 for implementation in the vehicle 130.
[048] At step 318, the second control unit 126 communicates the determined vehicular action to the first control unit 134 of the vehicle 130. The present invention provides safety solutions using the user's peripheral devices, thus increasing the vehicle's computing power by extending the personal digital assistant's sensors and computing resources.
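Steps 302 to 318 of method 300 can be summarized as a single control loop. The sketch below is illustrative only; all of the callables and the threshold are hypothetical stand-ins for the components named in the flow diagram.

```python
def control_loop(read_sensor_event, read_sensors, read_vehicle_states,
                 infer_with_confidence, send_action, threshold=0.8, cycles=1):
    """One pass per cycle through steps 302-318 of method 300 (sketch)."""
    performed = []
    for _ in range(cycles):
        if not read_sensor_event():                  # step 302: monitor for an event change
            continue
        datasets = read_sensors()                    # step 304: read sensor inputs
        states = read_vehicle_states()               # step 306: vehicle states from first control unit
        action, confidence = infer_with_confidence(datasets, states)  # steps 308-314
        if action is not None and confidence >= threshold:            # step 316: gate on confidence
            send_action(action)                      # step 318: communicate to first control unit
            performed.append(action)
    return performed
```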
[049] The claimed features/method steps of the present invention as discussed above are not routine, conventional, or well understood in the art, as the claimed features/steps enable the following solutions to the existing problems in conventional technologies.
[050] Further, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[051] By utilizing the sensors in the portable electronic device and computing performed by the portable electronic device, the present invention enhances the vehicle's performance without manual intervention. Further, integrating PDAs into vehicle control systems offers a streamlined assembly process as compared to traditional methods requiring additional hardware components. This simplification leads to reduced manufacturing costs. Moreover, the incorporation of PDAs for vehicle control and safety functions leads to a reduction in the number of individual components needed within the vehicle's systems. This reduction in parts can simplify assembly, decrease maintenance requirements, and potentially lower manufacturing costs. Further, by extending the capabilities of vehicle controllers with PDA sensors, the present invention enhances the overall safety of the user by providing real-time monitoring, detection, and intervention capabilities. The present invention provides safety solutions using the user's peripheral devices, thus increasing the vehicle's computing power by utilizing the sensors and computing resources of the portable electronic device. This approach of the present invention eliminates the need for integrating resource-heavy artificial intelligence (AI) applications into conventional vehicle control units, thereby ensuring enhanced safety without compromising system efficiency.
[052] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

List of Reference Numerals

100- System for Controlling a Vehicle
120- Portable Electronic Device
122- Wireless Transceiver
124- Plurality of Sensors
126- Second Control Unit
130- Vehicle
132- Wireless Transceiver
134- First Control Unit
136- Vehicle Mobility Controller
200, 300- Method for Controlling a Vehicle

Claims:
WE CLAIM:

1. A system (100) for controlling a vehicle (130), the system (100) comprising:
a first control unit (134), the first control unit (134) provided in the vehicle (130); and
a second control unit (126), the second control unit (126) provided in a portable electronic device (120) and configured to be communicatively coupled to the first control unit (134), the second control unit (126) configured to:
- receive, a signal indicative of one or more datasets from a plurality of sensors (124);
- receive, information associated with one or more states of the vehicle (130) from the first control unit (134);
- determine, based on the one or more datasets and the information associated with one or more states of the vehicle (130), one or more actions on the vehicle (130); and
- control, based on one or more actions, the first control unit (134) of the vehicle (130).

2. The system (100) as claimed in claim 1, wherein the second control unit (126) being configured to process one or more datasets and the information associated with one or more states of the vehicle (130).
3. The system (100) as claimed in claim 1, wherein the second control unit (126) being configured to determine, based on the one or more datasets and the information associated with one or more states of the vehicle (130), a first inference, the first inference being information corresponding to the one or more states of the vehicle (130).

4. The system (100) as claimed in claim 3, wherein the second control unit (126) being configured to: calculate, a confidence level of the first inference; and transmit, based on the confidence level, the one or more actions to the first control unit (134) of the vehicle (130).
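The confidence-gated transmission of claims 3 and 4 can be sketched as follows. This is a minimal illustrative sketch: the inference rule, the function names, and the threshold value are assumptions for exposition, not part of the specification.

```python
# Hypothetical sketch of confidence-gated action transmission (claims 3-4).
# The inference rule, names, and threshold below are illustrative assumptions.

def first_inference(datasets, vehicle_states):
    """Toy first inference: score disagreement between a sensor dataset
    and the vehicle state reported by the first control unit."""
    sensor_speed = datasets.get("accelerometer_speed", 0.0)
    reported_speed = vehicle_states.get("speed", 0.0)
    mismatch = abs(sensor_speed - reported_speed)
    # Stand-in confidence score, clipped to [0, 1].
    confidence = min(1.0, mismatch / 10.0)
    return {"state": "speed_mismatch", "confidence": confidence}

def gate_action(inference, threshold=0.7):
    """Transmit an action to the first control unit only when the
    calculated confidence level meets the (assumed) threshold."""
    if inference["confidence"] >= threshold:
        return {"action": "audio_alert", "target": "first_control_unit"}
    return None

inference = first_inference({"accelerometer_speed": 48.0}, {"speed": 30.0})
print(gate_action(inference))  # high confidence, so an alert is transmitted
```

Gating on confidence means low-certainty inferences never reach the vehicle's first control unit, which is the safety property these claims describe.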

5. The system (100) as claimed in claim 1, wherein the plurality of sensors (124) being integrated with the portable electronic device (120).

6. The system (100) as claimed in claim 1, wherein the plurality of sensors (124) being communicably coupled to the second control unit (126).

7. The system (100) as claimed in claim 1, wherein the plurality of sensors (124) comprises at least one of a camera, an accelerometer, a magnetometer and a gyroscope.

8. The system (100) as claimed in claim 1, wherein the second control unit (126) being configured to receive, the one or more datasets from the plurality of sensors (124) at pre-defined time intervals.
9. The system (100) as claimed in claim 1, wherein the plurality of sensors (124) being configured to generate one or more datasets based on detection of an event change.

10. The system (100) as claimed in claim 1, wherein the portable electronic device (120) being at least one of a personal digital assistant device, a computing device, a mobile device, a smart headset device and a smart watch.

11. The system (100) as claimed in claim 1, wherein the one or more actions being at least one of: an audio alert to the first control unit (134) of the vehicle (130), a visual alert to the first control unit (134) of the vehicle (130), and a communication including at least one of voice, call and text transmitted to the portable electronic device (120).

12. A method (200) for controlling a vehicle (130), a first control unit (134) provided in the vehicle (130), the method (200) comprising the steps of:
- receiving, by a second control unit (126), a signal indicative of one or more datasets from a plurality of sensors (124), the second control unit (126) provided in a portable electronic device (120) and configured to be communicatively coupled to the first control unit (134);
- receiving, by the second control unit (126), information associated with one or more states of the vehicle (130) from the first control unit (134);
- determining, by the second control unit (126), based on the one or more datasets and the information associated with one or more states of the vehicle (130), one or more actions on the vehicle (130); and
- controlling, by the second control unit (126), based on one or more actions, the first control unit (134) of the vehicle (130).
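The four steps of claim 12 can be read as a single control cycle executed by the second control unit. The sketch below is illustrative only: the callables `read_sensors`, `read_vehicle_state`, and `send_to_vehicle`, and the toy decision rule, are assumptions standing in for the interfaces the specification leaves open.

```python
# Illustrative control cycle for method (200); all interfaces are assumed stand-ins.

def determine_actions(datasets, states):
    """Toy decision rule: raise a visual alert when the camera dataset
    reports an obstacle while the vehicle state shows it is moving."""
    if datasets.get("camera") == "obstacle" and states.get("speed", 0) > 0:
        return ["visual_alert"]
    return []

def control_cycle(read_sensors, read_vehicle_state, send_to_vehicle):
    """One cycle of the claimed method on the second control unit (126)."""
    datasets = read_sensors()          # step 1: datasets from the plurality of sensors (124)
    states = read_vehicle_state()      # step 2: vehicle states from the first control unit (134)
    actions = determine_actions(datasets, states)  # step 3: determine one or more actions
    for action in actions:             # step 4: control the first control unit (134)
        send_to_vehicle(action)
    return actions

sent = []
actions = control_cycle(lambda: {"camera": "obstacle"},
                        lambda: {"speed": 12},
                        sent.append)
print(actions)  # ['visual_alert']
```

In practice the cycle would repeat at the pre-defined intervals of claim 19, or be triggered by the event changes of claim 20.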

13. The method (200) as claimed in claim 12, wherein the method (200) comprises the step of processing, by the second control unit (126), one or more datasets and the information associated with one or more states of the vehicle (130).

14. The method (200) as claimed in claim 12, wherein the method (200) comprises the step of determining, by the second control unit (126), based on the one or more datasets and the information associated with one or more states of the vehicle (130), a first inference, the first inference being information corresponding to the one or more states of the vehicle (130).

15. The method (200) as claimed in claim 14, wherein the method (200) comprises the step of calculating, by the second control unit (126), a confidence level of the first inference; and transmitting, based on the confidence level, the one or more actions to the first control unit (134) of the vehicle (130).

16. The method (200) as claimed in claim 12, wherein the plurality of sensors (124) being integrated with the portable electronic device (120).

17. The method (200) as claimed in claim 12, wherein the plurality of sensors (124) being communicably coupled to the second control unit (126).

18. The method (200) as claimed in claim 12, wherein the plurality of sensors (124) comprises at least one of a camera, an accelerometer, a magnetometer and a gyroscope.

19. The method (200) as claimed in claim 12, wherein the method (200) comprises the step of receiving, by the second control unit (126), the one or more datasets from the plurality of sensors (124) at pre-defined time intervals.

20. The method (200) as claimed in claim 12, wherein the method (200) comprises the step of generating, by the plurality of sensors (124), one or more datasets based on detection of an event change.
21. The method (200) as claimed in claim 12, wherein the portable electronic device (120) being at least one of a personal digital assistant device, a computing device, a mobile device, a smart headset device and a smart watch.

22. The method (200) as claimed in claim 12, wherein the one or more actions being at least one of: an audio alert to the first control unit (134) of the vehicle (130), a visual alert to the first control unit (134) of the vehicle (130), and a communication including at least one of voice, call and text transmitted to the portable electronic device (120).

Dated this 20th day of March 2024
TVS MOTOR COMPANY LIMITED
By their Agent & Attorney


(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471

Documents

Application Documents

# Name Date
1 202441021224-STATEMENT OF UNDERTAKING (FORM 3) [20-03-2024(online)].pdf 2024-03-20
2 202441021224-REQUEST FOR EXAMINATION (FORM-18) [20-03-2024(online)].pdf 2024-03-20
3 202441021224-PROOF OF RIGHT [20-03-2024(online)].pdf 2024-03-20
4 202441021224-POWER OF AUTHORITY [20-03-2024(online)].pdf 2024-03-20
5 202441021224-FORM 18 [20-03-2024(online)].pdf 2024-03-20
6 202441021224-FORM 1 [20-03-2024(online)].pdf 2024-03-20
7 202441021224-FIGURE OF ABSTRACT [20-03-2024(online)].pdf 2024-03-20
8 202441021224-DRAWINGS [20-03-2024(online)].pdf 2024-03-20
9 202441021224-DECLARATION OF INVENTORSHIP (FORM 5) [20-03-2024(online)].pdf 2024-03-20
10 202441021224-COMPLETE SPECIFICATION [20-03-2024(online)].pdf 2024-03-20
11 202441021224-Proof of Right [05-07-2024(online)].pdf 2024-07-05