Abstract: System and Method for Estimating Centre of Mass of a Two-wheeled Vehicle with One or More Users. The present disclosure provides a system (100) for estimating the centre of mass (C) of a two-wheeled vehicle (102) with one or more users (108, 110). The system (100) comprises a control unit (118) communicably coupled with at least one first sensor (104a, 104b), user-data sensors (114) and vehicle-data sensors (116). The control unit (118) is configured to determine a head centre of mass (Chr, Chp) and posture data for each of the users (108, 110). Thereafter, the orientation of the vehicle (102) with respect to a vertical plane (Y-Y’) is determined. The control unit (118) then estimates the centre of mass (C) of the vehicle based on the head centre of mass (Chr, Chp), the posture data of each user (108, 110), the operating parameters and the orientation of the vehicle (102). The system (100) thus computes the centre of mass (C) of the two-wheeled vehicle (102) with one or more users (108, 110). Reference Figure 1
Description:FORM 2
THE PATENTS ACT, 1970
(39 OF 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10, Rule 13]
TITLE OF INVENTION
System and Method for Estimating Centre of Mass of a Two-wheeled Vehicle with One or More Users
APPLICANT
TVS MOTOR COMPANY LIMITED, an Indian company, having its address at “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India.
PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
[001] The present invention generally relates to a two-wheeled vehicle. More particularly, the present invention relates to a system and a method for estimating centre of mass of a two-wheeled vehicle with one or more users.
BACKGROUND OF THE INVENTION
[002] Motorcycles are typically characterized by a large rider-to-vehicle mass ratio, making estimation of the Centre of Mass (CoM) of the vehicle together with a rider and/or a pillion rider especially relevant. This is because any change in the position of the rider and/or the pillion rider affects the stability of the motorcycle or the rider-vehicle system. As such, a system that focuses on rider safety and/or stability control of the motorcycle necessitates considering parameters pertaining to the CoM of the rider and/or the pillion rider. Also, while considering the vehicle and the rider along with the pillion rider, the system becomes even more complex, as the behavior of the pillion rider may differ from that of the rider. This is because the pillion rider cannot always predict what maneuver the rider wishes to perform. As such, the CoM of the system is dependent not only on the rider’s centre of mass but also on that of the pillion rider.
[003] Additionally, information pertaining to the rate of change of throttle position, the vehicle speed, the lean angle while cornering, etc., enables building a strong rider-characterization model that can be used for tuning parameters of the motorcycle to suit the driving style of the rider.
[004] The advent of modern technologies has provided systems that determine body posture for determining the CoM of the rider. One such method is to provide a plurality of sensors on the body of the rider. However, mounting a plurality of sensors on the body of the rider renders the process cumbersome, messy, and expensive. Another method for determining the CoM of the rider is the use of a camera system. However, the camera system is incapable of determining the posture and the CoM of the rider and the pillion rider simultaneously.
[005] In view of the above, there is a need for a system and a method for estimating centre of mass of a two-wheeled vehicle with one or more users, which addresses one or more limitations stated above.
SUMMARY OF THE INVENTION
[006] In one aspect, a system for estimating the centre of mass of a two-wheeled vehicle with one or more users is provided. The system comprises at least one first sensor disposed in a helmet worn by each of the one or more users of the two-wheeled vehicle. The one or more users comprise at least one of a rider and a pillion rider of the two-wheeled vehicle, wherein the at least one first sensor is adapted to generate head movement data of each of the one or more users. A vehicle-tilt sensor is disposed in the two-wheeled vehicle. The vehicle-tilt sensor is adapted to monitor orientation of the two-wheeled vehicle with respect to a vertical plane Y-Y’ of the two-wheeled vehicle. One or more user-data sensors are disposed in the two-wheeled vehicle. The one or more user-data sensors are adapted to procure position-related data of each of the one or more users. Further, one or more vehicle-data sensors are disposed in the two-wheeled vehicle for monitoring operating parameters of the two-wheeled vehicle. A control unit is disposed in the two-wheeled vehicle and communicably coupled with the at least one first sensor, the one or more user-data sensors and the one or more vehicle-data sensors. The control unit is configured to receive the head movement data of each of the one or more users from each of the at least one first sensor, the position-related data from each of the one or more user-data sensors, the operating parameters of the two-wheeled vehicle from the one or more vehicle-data sensors and the orientation of the two-wheeled vehicle from the vehicle-tilt sensor. The control unit thereafter determines a head centre of mass for each of the one or more users of the two-wheeled vehicle based on the data received from each of the at least one first sensor. The posture data of each of the one or more users is then determined based on the position-related data received from the one or more user-data sensors.
The orientation of the two-wheeled vehicle with respect to the vertical plane is then determined based on the data received from the vehicle-tilt sensor. Subsequently, the centre of mass of the two-wheeled vehicle with the one or more users is determined based on the head centre of mass of each of the one or more users, the posture data of each of the one or more users and the orientation of the two-wheeled vehicle.
[007] In an embodiment, the one or more user-data sensors comprise at least one force sensor and a load sensor. The at least one force sensor is disposed on each foot peg of the two-wheeled vehicle and is adapted to monitor the force exerted by each of the one or more users on each foot peg of the two-wheeled vehicle. The load sensor is disposed below a seat of the two-wheeled vehicle. The load sensor is adapted to monitor the weight exerted on the seat of the two-wheeled vehicle.
[008] In an embodiment, the load sensor is adapted to monitor load exerted on the seat based on at least one of weight of each of the one or more users seated on the two-wheeled vehicle and a rear mounted load disposed on the two-wheeled vehicle.
[009] In an embodiment, the one or more vehicle-data sensors comprise a throttle-position sensor adapted to monitor a rate of change of a throttle position in a throttle body of the two-wheeled vehicle and a speed sensor adapted to monitor speed of the two-wheeled vehicle.
[010] In an embodiment, the control unit is adapted to preprocess the head movement data of each of the one or more users, the position-related data of each of the one or more users, the operating parameters of the two-wheeled vehicle and the orientation of the two-wheeled vehicle by sampling and prefiltering the received head movement data, the position-related data, the operating parameters, and the orientation of the two-wheeled vehicle at a pre-defined sampling rate.
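By way of a non-limiting illustration, the sampling-and-prefiltering step described above may be sketched as follows. The disclosure does not name a particular filter, so a simple moving-average prefilter is assumed here purely for illustration, as is the window width:

```python
def preprocess(samples, window=5):
    """Prefilter a stream of sensor samples with a moving average.

    `samples` is a list of floats taken at the pre-defined sampling
    rate; `window` is an assumed smoothing width (not specified in
    the disclosure).
    """
    filtered = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        chunk = samples[lo:i + 1]          # trailing window of readings
        filtered.append(sum(chunk) / len(chunk))
    return filtered

# Example: smooth a noisy angular-velocity trace
raw = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
smooth = preprocess(raw, window=2)
```

In practice the same prefilter would be applied to each of the received streams (head movement data, position-related data, operating parameters and vehicle orientation) at the pre-defined sampling rate.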
[011] In an embodiment, the control unit, upon filtering, is configured to fuse the head movement data of each of the one or more users through a sensor-fusion technique for determining the orientation of the head of each of the one or more users with respect to a ground surface.
[012] In an embodiment, the control unit is configured to determine the orientation of the head of each of the one or more users with respect to the two-wheeled vehicle, based on the determined orientation of the head of each of the one or more users with respect to the ground surface and the orientation of the two-wheeled vehicle.
[013] In an embodiment, the control unit is configured to determine the posture data of each of the one or more users through a posture determining model based on the head movement data of each of the one or more users and force exerted by each of the one or more users on each foot peg from one or more force sensors of the one or more user-data sensors.
[014] In an embodiment, the control unit is configured to determine a centre of mass of each of the one or more users based on the posture data of each of the one or more users and a load data received from a load sensor of the one or more user-data sensors.
[015] In an embodiment, the control unit is configured to determine the centre of mass of the two-wheeled vehicle based on the centre of mass of each of the one or more users and the orientation of the two-wheeled vehicle determined by the vehicle-tilt sensor.
[016] In an embodiment, the control unit is configured to control the operating parameters of the two-wheeled vehicle based on the estimated centre of mass of the two-wheeled vehicle.
[017] In another aspect, a method for estimating the centre of mass of the two-wheeled vehicle with the one or more users is depicted. The method comprises receiving the head movement data of each of the one or more users by the control unit from each of the at least one first sensor, the position-related data from each of the one or more user-data sensors, the operating parameters of the two-wheeled vehicle from the one or more vehicle-data sensors and the orientation of the two-wheeled vehicle from the vehicle-tilt sensor. The control unit thereafter determines the head centre of mass for each of the one or more users of the two-wheeled vehicle based on the data received from each of the at least one first sensor. The posture data of each of the one or more users is then determined based on the position-related data received from the one or more user-data sensors. The orientation of the two-wheeled vehicle with respect to the vertical plane is then determined based on the data received from the vehicle-tilt sensor. Subsequently, the centre of mass of the two-wheeled vehicle is determined based on the head centre of mass of each of the one or more users, the posture data of each of the one or more users, the operating parameters of the two-wheeled vehicle and the orientation of the two-wheeled vehicle through an estimation model.
BRIEF DESCRIPTION OF THE DRAWINGS
[018] Reference will be made to embodiments of the invention, examples of which may be illustrated in accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a block diagram of a system for estimating a centre of mass of the two-wheeled vehicle with one or more users, in accordance with an exemplary embodiment of the present disclosure.
Figure 2 is a schematic side view of a two-wheeled vehicle depicting a rider and a load, in accordance with an exemplary embodiment of the present invention.
Figure 3 is a schematic side view of the two-wheeled vehicle depicting the rider and a pillion rider, in accordance with an exemplary embodiment of the present disclosure.
Figure 4 is a block diagram of a control unit of the system, in accordance with exemplary embodiment of the present disclosure.
Figure 5a is a graphical representation depicting variation in angular velocity with respect to time determined by at least one first sensor disposed in a helmet worn by each of one or more users of the two-wheeled vehicle, in accordance with an exemplary embodiment of the present disclosure.
Figure 5b is a graphical representation depicting variation in acceleration with respect to time determined by the at least one first sensor disposed in the helmet worn by each of the one or more users of the two-wheeled vehicle, in accordance with an exemplary embodiment of the present disclosure.
Figure 6 is a graphical representation depicting orientation with respect to time filtered by the control unit based on the data received from the at least one first sensor, in accordance with an exemplary embodiment of the present disclosure.
Figure 7 is a graphical representation of a centre of mass distribution determined by the control unit, in accordance with an exemplary embodiment of the present disclosure.
Figure 8 is a graphical representation of a centre of mass distribution determined by the control unit, in accordance with an exemplary embodiment of the present disclosure.
Figure 9 is a flow diagram depicting a method for estimating centre of mass of the rider, in accordance with an exemplary embodiment of the present disclosure.
Figure 10 is a flow diagram depicting a method for estimating centre of mass of the pillion rider, in accordance with an exemplary embodiment of the present disclosure.
Figure 11 is a flow diagram depicting a method for estimating the centre of mass of the two-wheeled vehicle including the rider and the pillion rider, in accordance with an exemplary embodiment of the present disclosure.
Figure 12 is a flow diagram depicting a method for estimating the centre of mass of the two-wheeled vehicle with the one or more users, in accordance with an exemplary embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION
[019] Various features and embodiments of the present invention here will be discernible from the following further description thereof, set out hereunder.
[020] Figure 1 is a block diagram of a system 100 for estimating centre of mass C of the two-wheeled vehicle 102 with one or more users 108, 110, in accordance with an exemplary embodiment of the present invention. The system 100 is adapted to estimate the centre of mass C (shown in Figures 2 and 3) of the two-wheeled vehicle 102 (hereinafter referred to as ‘vehicle 102’) upon determining the posture as well as a centre of mass of the one or more users 108, 110 (shown in Figure 3) (hereinafter referred to as ‘users 108, 110’) of the vehicle 102 and/or a rear mounted load 120 (as shown in Figure 2) for accurate determination of the centre of mass C of the vehicle 102. Also, determination of the posture as well as the centre of mass of the users 108, 110 enables an understanding of the behavior of a rider 108 and the vehicle operating regions, thereby enabling the system 100 to control vehicle operating parameters for providing a better riding experience as well as enhanced safety control in the vehicle 102.
[021] Referring to Figures 2 and 3 in conjunction with Figure 1, the system 100 comprises at least one first sensor 104a, 104b (hereinafter interchangeably referred to as ‘first sensor 104a, 104b’) disposed in a helmet 106 worn by each of the users 108, 110 of the vehicle 102. The first sensor 104a, 104b is adapted to generate head movement data of the each of the users 108, 110. In an embodiment, head movement data may pertain to rotation or tilt of head (not shown) of each of the users 108, 110. In another embodiment, each of the first sensor 104a, 104b is an inertial measurement unit (IMU) or a gyroscope or an accelerometer adapted to monitor rotation or tilt of the head of each of the users 108, 110 for generating the head movement data. In an embodiment, each of the first sensor 104a, 104b is adapted to determine the head movement data based on the rotation or tilt of head of each of the users 108, 110 about a vertical plane Y-Y’ (shown in Figure 2) of the vehicle 102. In an embodiment, the head movement data determined by the first sensor 104a, 104b is three-dimensional as depicted in Figures 5a and 5b, wherein a rotational angular velocity and an acceleration in X, Y, and Z co-ordinates during tilting or orientation of the head is considered at various instances of time.
[022] In the present embodiment, the users 108, 110 comprise at least one of the rider 108 and a pillion rider 110 of the vehicle 102. Accordingly, a first sensor 104a is provided in the helmet 106 worn by the rider 108 and a first sensor 104b is provided in the helmet 106 worn by the pillion rider 110.
[023] Further, a vehicle-tilt sensor 112 is disposed in the vehicle 102. The vehicle-tilt sensor 112 is adapted to monitor orientation of the vehicle 102 with respect to the vertical plane Y-Y’ of the two-wheeled vehicle 102. In an embodiment, the vehicle-tilt sensor 112 monitors the orientation based on tilting of the vehicle 102 with respect to the vertical plane Y-Y’. In an embodiment, the vehicle-tilt sensor 112 is an inertial measurement unit (IMU) or a gyroscope or an accelerometer adapted to monitor orientation of the vehicle 102. The IMU may be a 6 to 9 degree of freedom IMU.
[024] Additionally, the system 100 comprises one or more user-data sensors 114 (hereinafter referred to as ‘user-data sensors 114’) disposed in the vehicle 102. The user-data sensors 114 are adapted to procure position-related data of each of the users 108, 110. In an embodiment, the position-related data pertains to the position or stance of each user 108, 110 on the vehicle 102, such as a right-side inclination or a left-side inclination of each user 108, 110 on the vehicle 102. In the present embodiment, the user-data sensors 114 comprise at least one force sensor 114a and a load sensor 114b.
[025] In an embodiment, the at least one force sensor 114a (hereinafter referred to as ‘force sensor 114a’) is disposed on each foot peg (not shown in Figures) of the vehicle 102. The force sensor 114a is adapted to monitor the force or pressure exerted by the foot (not shown) of each user 108, 110 for generating or procuring the position-related data. In an embodiment, four force sensors 114a are provided, one on each of the four foot pegs of the vehicle 102. Accordingly, two force sensors 114a are provided on a front-right foot peg (not shown) and a front-left foot peg (not shown) of the vehicle 102, for procuring the position-related data of the rider 108. Thus, based on the force or pressure applied by the rider 108 through his right leg or left leg, the force sensors 114a determine the position-related data of the rider 108. Also, two force sensors 114a are provided on a rear-right foot peg (not shown) and a rear-left foot peg (not shown) of the vehicle 102, for procuring the position-related data of the pillion rider 110. As such, based on the force or pressure applied by the pillion rider 110 through his right leg or left leg, the force sensors 114a determine the position-related data of the pillion rider 110. In an embodiment, each of the force sensors 114a mounted on the vehicle 102 is a strain sensor or a strain gauge.
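By way of a non-limiting illustration, a left/right stance indicator may be derived from the pair of foot-peg force readings for one user. The mapping to a normalized [-1, 1] value is an assumption for illustration; the disclosure only states that the force sensors yield position-related data:

```python
def lateral_stance(left_force, right_force):
    """Infer a user's lateral inclination from foot-peg forces.

    Returns a value in [-1, 1]: negative for a left-side
    inclination, positive for a right-side inclination, 0 for a
    balanced stance. Forces are in any consistent unit (e.g. N).
    """
    total = left_force + right_force
    if total == 0:
        return 0.0  # no foot contact detected
    return (right_force - left_force) / total

# Rider pressing harder on the right peg -> right-side inclination
stance = lateral_stance(left_force=200.0, right_force=300.0)
```

The same computation would be repeated for the rear foot-peg pair to obtain the pillion rider's stance.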
[026] In an embodiment, the load sensor 114b is disposed below a seat (not shown) and may be positioned towards a longitudinal rear end of the vehicle 102. The load sensor 114b is adapted to monitor weight exerted on the seat of the vehicle 102. As such, the load sensor 114b is adapted to monitor weight of the users 108, 110. The load sensor 114b is also adapted to monitor weight of a rear mounted load 120 (shown in Figure 2) on the vehicle 102. In an embodiment, the rear mounted load 120 may be mounted with or without the pillion rider 110. In an embodiment, the load sensor 114b may be a strain sensor. In an embodiment, if only the rider 108 is seated on the vehicle 102, the load sensor 114b determines the load or weight exerted on the seat. The load sensor 114b and the force sensor 114a may generate electrical signal output indicative of the mass of the rider 108 and/or the pillion rider 110 and the force exerted by the rider 108 and/or the pillion rider 110 on the foot peg, respectively. As an example, the mass of the rider 108 is determined as 75 kg.
[027] The system 100 further comprises one or more vehicle-data sensors 116 (hereinafter referred to ‘vehicle-data sensors 116’) disposed in the vehicle 102. The vehicle-data sensors 116 are adapted to monitor operating parameters of the vehicle 102. In the present embodiment, the vehicle-data sensors 116 comprise a throttle-position sensor 116a and a speed sensor 116b. In an embodiment, the throttle-position sensor 116a is mounted on a throttle body (not shown) of the vehicle 102. The throttle-position sensor 116a is adapted to monitor a rate of change of a throttle position in the throttle body of the vehicle 102. In an embodiment, the speed sensor 116b is connected to a wheel (not shown) of the vehicle 102. The speed sensor 116b is adapted to monitor speed of the vehicle 102.
[028] Referring to Figure 4 in conjunction with Figures 1-3, the system 100 comprises a control unit 118 disposed in the vehicle 102. The control unit 118 is communicably coupled with the first sensors 104a, 104b, the user-data sensors 114, and the vehicle-data sensors 116. In an embodiment, the control unit 118 is communicably coupled to the first sensors 104a, 104b, the user-data sensors 114, and the vehicle-data sensors 116 through a wired connection or a wireless connection, as per design feasibility and requirement. The control unit 118 is adapted to estimate the centre of mass C of the vehicle 102, based on the data procured by the first sensors 104a, 104b, the user-data sensors 114, and the vehicle-data sensors 116.
[029] In an embodiment, the control unit 118 can be in communication with at least one vehicle control unit (not shown) of the vehicle 102. Accordingly, the control unit 118 may obtain data pertaining to the speed of the vehicle 102, the throttle position, and/or the orientation of the vehicle 102, from the at least one vehicle control unit. In an embodiment, the control unit 118 may comprise one or more additional components such as, but not limited to, an input/output module 122, a processing module 126 with an analytic module 128.
[030] In an embodiment, the control unit 118 may be embodied as a multi-core processor, a single core processor, or a combination of one or more multi-core processors and one or more single core processors. For example, the control unit 118 may be embodied as one or more of various processing devices or modules, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, but not limited to, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In yet another embodiment, the control unit 118 may be configured to execute hard-coded functionality. In still another embodiment, the control unit 118 may be embodied as an executor of instructions, where the instructions specifically configure the control unit 118 to perform the steps or operations described herein for estimating the centre of mass C of the vehicle 102.
[031] Further, the processing module 126 is communicably coupled to a memory unit 124. The memory unit 124 is capable of storing information processed by the processing module 126 for determining the centre of mass C of the vehicle 102 and the data procured by the control unit 118 from the first sensors 104a, 104b, the user-data sensors 114, and the vehicle-data sensors 116. In an embodiment, the memory unit 124 may be external to the control unit 118.
[032] In an embodiment, the memory unit 124 is embodied as one or more volatile memory devices, one or more non-volatile memory devices and/or combination thereof, such as magnetic storage devices, optical-magnetic storage devices and the like as per design feasibility and requirement. The memory unit 124 communicates with the control unit 118 or the processing module 126 via suitable interfaces such as Advanced Technology Attachment (ATA) adapter, a Serial ATA [SATA] adapter, a Small Computer System Interface [SCSI] adapter, a network adapter or any other component enabling communication between the memory unit 124 and the control unit 118 or the processing module 126. In an embodiment, the control unit 118 may be connected to a power supply such as a battery module (not shown) of the vehicle, for receiving electrical power. In an embodiment, the control unit 118 may have an inbuilt power supply 130 for drawing power from the battery module of the vehicle 102.
[033] In an embodiment, the control unit 118 or the analytic module 128 of the processing module 126 of the control unit 118 is adapted to estimate the centre of mass C of the vehicle 102 based on the data procured by the first sensors 104a, 104b, the user-data sensors 114, and the vehicle-data sensors 116. That is, the control unit 118 or the analytic module 128 estimates the centre of mass C of the vehicle 102 based on the head movement data (as shown in Figures 5a and 5b) of each of the users 108, 110 received from each of the first sensors 104a, 104b, the position-related data from each of the user-data sensors 114, operating parameters of the vehicle 102 from the vehicle-data sensors 116, and orientation of the vehicle 102 from the vehicle-tilt sensor 112.
[034] In an embodiment, the control unit 118 determines, a head centre of mass Chr, Chp (also shown in Figures 7-10) for each of the users 108, 110 of the vehicle 102 based on the data received from the first sensors 104a, 104b. In an embodiment, the Chr is indicative of head centre of mass of the rider 108, while Chp is indicative of head centre of mass of the pillion rider 110. In an embodiment, the control unit 118 determines the Chr and/or Chp based on presence of the rider 108 and/or the pillion rider 110 on the vehicle 102.
[035] The control unit 118 determines the head centre of mass Chr, Chp by first determining the head orientation of the users 108, 110 (i.e. the head orientation of the rider 108 and/or the pillion rider 110). The control unit 118 determines the head orientation with respect to a ground surface (not shown) by fusing or combining the head movement data (shown in Figures 5a and 5b) received from the first sensors 104a, 104b. In an embodiment, when the head movement data is from the IMUs, the control unit 118 performs fusion of the head movement data in X, Y, Z co-ordinates. In an embodiment, the control unit 118 fuses the head movement data through a sensor-fusion technique. The control unit 118 determines the head orientation of the users 108, 110 with respect to the ground surface, that is, the angle formed by the head of the rider 108 and the pillion rider 110 with respect to the X, Y, and Z axes in 3D space. In an embodiment, the control unit 118 determines the orientation of the vehicle 102 with respect to the vertical plane Y-Y’ based on the data received from the vehicle-tilt sensor 112. Thereafter, the control unit 118 determines the orientation of the head of the users 108, 110 with respect to the vehicle 102 (or the vertical plane Y-Y’) based on the head orientation with respect to the ground surface and the orientation of the vehicle 102 determined from the vehicle-tilt sensor 112. Based on the head orientation of the users 108, 110 with respect to the vehicle 102, the head centre of mass Chr, Chp is determined.
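By way of a non-limiting illustration, the sensor-fusion and frame-change steps above may be sketched as follows. The disclosure names only "a sensor-fusion technique"; the complementary filter, its gain `alpha`, the sampling interval `dt`, and the single-axis (tilt-angle) simplification are all assumptions for illustration:

```python
def fuse_head_orientation(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Complementary-filter fusion of gyroscope rates (deg/s) and
    accelerometer-derived tilt angles (deg) into a head tilt angle
    with respect to the ground."""
    angle = accel_angles[0]
    for rate, acc_angle in zip(gyro_rates, accel_angles):
        # Integrate the gyro rate, then pull the estimate toward
        # the accelerometer-derived angle by a factor (1 - alpha).
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc_angle
    return angle

def head_angle_wrt_vehicle(head_vs_ground_deg, vehicle_tilt_deg):
    """Head orientation relative to the vehicle's vertical plane
    Y-Y' = head-vs-ground angle minus the vehicle's own tilt."""
    return head_vs_ground_deg - vehicle_tilt_deg

# Steady 1 deg/s head rotation with the accelerometer reading ~10 deg
head_ground = fuse_head_orientation([1.0] * 100, [10.0] * 100)
head_vehicle = head_angle_wrt_vehicle(head_ground, vehicle_tilt_deg=5.0)
```

A full implementation would apply this per axis (roll, pitch, yaw) to the 3D data of Figures 5a and 5b; the one-axis form above only illustrates the structure of the fusion and the subtraction of the vehicle tilt.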
[036] The control unit 118 also determines the posture data of each of the users 108, 110 based on the position-related data received from the user-data sensors 114 and the head movement data. The posture data pertains to the orientation or tilt of a torso (not shown), legs (not shown), and hands (not shown) of the users 108, 110, determined using the data procured by the user-data sensors 114 and the head movement data. In an embodiment, the control unit 118 is configured to determine the posture data of the users 108, 110 through a posture determining model based on the head movement data and the data procured by the user-data sensors 114. In an embodiment, the posture determining model may be a data-driven model or a mathematical model or a machine learning model.
[037] In an embodiment, since the head orientation of the users 108, 110 in 3D space is determined relative to the erect position, the control unit 118 may determine the orientation of the torso of the rider 108 and the pillion rider 110 based on the interrelation between the torso orientation and the head orientation of the users 108, 110. For example, if the head orientation is determined to be tilted by 20 degrees from the erect position, the control unit 118 can determine the torso orientation to be 24 degrees from the erect position. Further, from the data from the user-data sensors 114 and the orientation of the torso, the control unit 118 determines the orientation of the legs and hands of the rider 108 and/or the pillion rider 110.
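The head-to-torso interrelation in the example above (20 degrees of head tilt mapping to 24 degrees of torso tilt) corresponds to a ratio of 1.2; treating the interrelation as a constant linear factor is an assumption made here only to illustrate the computation:

```python
def torso_tilt_from_head(head_tilt_deg, torso_head_ratio=1.2):
    """Estimate torso tilt from head tilt via an assumed constant
    interrelation factor (1.2 reproduces the 20 deg -> 24 deg
    example in the text)."""
    return head_tilt_deg * torso_head_ratio

torso = torso_tilt_from_head(20.0)  # -> 24.0 degrees
```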
[038] In an embodiment, for the user 108 being a human, the control unit 118 is aware of the contribution (or weight distribution) of the head, the torso, the legs, and the hands to the weight or mass of the human. For example, the torso contributes 70% of the weight of the human and the head contributes 30% of the weight of the human. In an embodiment, the orientations of the head, the torso, the legs, and the hands also contribute the same percentages to the centre of mass of the rider 108. Thus, the control unit 118 determines the posture data from the orientations of the torso, the legs, the hands, and the head and the weightages associated with each of these orientations using the posture determining model.
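By way of a non-limiting illustration, the weighted combination of body segments may be sketched as follows. The weightages follow the example in the text (torso 70%, head 30%); the two-segment simplification and the segment CoM coordinates in the vehicle frame are assumptions for illustration:

```python
def user_centre_of_mass(segments):
    """Combine body-segment centres of mass into a single user CoM.

    `segments` maps a segment name to (weight_fraction, (x, y)),
    where the fractions sum to 1 and (x, y) is the segment's CoM
    position in the vehicle frame (metres, assumed axes).
    """
    x = sum(frac * pos[0] for frac, pos in segments.values())
    y = sum(frac * pos[1] for frac, pos in segments.values())
    return (x, y)

# Example weightages from the text: torso 70%, head 30%
rider_com = user_centre_of_mass({
    "torso": (0.7, (0.0, 1.0)),
    "head":  (0.3, (0.0, 1.6)),
})
# The rider CoM lies between the torso and head CoMs, nearer the torso.
```

A fuller model would include fractions for the legs and hands as well; the structure of the weighted sum is unchanged.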
[039] The control unit 118 now determines a centre of mass Cpr, Cpp (also shown in Figures 6-10) of each of the users 108, 110 based on the determined posture data of each of the users 108, 110 and the weight exerted by the users 108, 110 as obtained by the load sensor 114b. In an embodiment, the control unit 118 determines the centre of mass Cpr, Cpp based on a machine learning technique. Cpr is indicative of the centre of mass of the rider 108, while Cpp is indicative of the centre of mass of the pillion rider 110. In an embodiment, the control unit 118 determines the Cpr and/or Cpp based on the presence of the rider 108 and/or the pillion rider 110 on the vehicle 102. Based on the posture data, such as the rider 108 leaning towards the right, the rider 108 leaning towards the left, the rider 108 bent towards the front, etc., the control unit 118 may determine the centre of mass Cpr of the rider 108 with respect to the vehicle 102.
[040] The control unit 118 is further configured to estimate the centre of mass C of the vehicle 102 based on the weight of the vehicle 102 and the orientation of the vehicle 102 relative to the ground. Based on the centre of mass C of the vehicle 102 and the centre of mass Cpr, Cpp of the rider 108 and/or the pillion rider 110 or the rear mounted load 120, the control unit 118 determines the centre of mass C of the vehicle with the one or more users 108, 110. The control unit 118 is configured to control the operating parameters of the vehicle 102 based on the estimated centre of mass C of the vehicle 102 with the one or more users 108, 110. In an embodiment, the control unit 118 is adapted to compare the estimated centre of mass C of the vehicle 102 with the one or more users 108, 110 with a reference centre of mass that is to be maintained for the current operating conditions. Upon comparison, if the estimated centre of mass C of the vehicle 102 with the one or more users 108, 110 is greater than the reference centre of mass for the current operating conditions, the control unit 118 is adapted to control the operating parameters of the vehicle 102, in order to provide a safer riding experience to the rider 108.
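By way of a non-limiting illustration, combining the vehicle CoM with each user's CoM may be sketched as a mass-weighted average. The vehicle mass and CoM positions below are assumptions for illustration; only the 75 kg rider mass comes from the example in the text:

```python
def combined_centre_of_mass(bodies):
    """Mass-weighted combination of the vehicle CoM with each
    user's CoM, yielding the overall CoM of the rider-vehicle
    system. `bodies` is a list of (mass_kg, (x, y)) pairs in a
    common vehicle-frame coordinate system (metres, assumed axes).
    """
    total = sum(m for m, _ in bodies)
    x = sum(m * p[0] for m, p in bodies) / total
    y = sum(m * p[1] for m, p in bodies) / total
    return (x, y)

overall = combined_centre_of_mass([
    (150.0, (0.0, 0.45)),  # vehicle: assumed mass and CoM height
    (75.0,  (0.0, 1.00)),  # rider: 75 kg as in the example above
])
# The combined CoM lies between the two, closer to the heavier vehicle.
```

A pillion rider and/or a rear mounted load would simply appear as additional `(mass, position)` entries in the list.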
[041] Referring to Figure 9 in conjunction with Figures 1-4, a flow diagram depicting a method 900 for determining the centre of mass Cpr (as shown in Figure 6) of the rider 108 is provided.
[042] At step 902, the control unit 118 is adapted to set an initial centre of mass Cpr of the rider 108. This step initializes the system 100, so that real-time determination of the centre of mass Cpr is enabled. The method 900 then moves to step 904.
[043] At step 904, the control unit 118 determines or acquires the load exerted on the seat of the vehicle 102 by the rider 108 through the load sensor 114b. In an embodiment, the load sensor 114b may determine the load acting on the seat to be the weight of the rider 108 in the absence of the pillion rider 110 and the rear mounted load 120. As an example, the load sensor 114b may determine the load or weight of the rider 108 as 75 kg. The method 900 then moves to step 906.
[044] At step 906, the control unit 118 acquires the head movement data (as shown in Figures 5a and 5b) of the rider 108 from the first sensor 104a disposed in the helmet 106 worn by the rider 108. The head movement data provides data pertaining to tilt or orientation of the rider 108 seated on the vehicle 102. Upon acquiring the head movement data of the rider 108, the method 900 moves to step 908.
[045] At step 908, the control unit 118 preprocesses the head movement data of the rider 108. In an embodiment, the control unit 118 preprocesses the head movement data by filtering (as shown in Figure 6) the data using conventional filtering techniques known in the art, at a required sampling rate or frequency. In an embodiment, the control unit 118 preprocesses or filters the head movement data along with the data pertaining to the operating parameters of the vehicle 102. Upon filtering, the control unit 118 at step 910 fuses the data received from the first sensor along with the operating parameters of the vehicle 102. In an embodiment, the control unit 118 fuses the data based on the sensor-fusion technique. In an embodiment, when the head movement data is from the IMUs (e.g. as shown in Figures 5a and 5b), the control unit 118 performs fusion of the head movement data in X, Y, Z co-ordinates (referenced as 132 for the X co-ordinate, 134 for the Y co-ordinate and 136 for the Z co-ordinate in Figures 5a and 5b). From the head movement data, the control unit 118 determines roll data (depicted as 138 in Figure 6), yaw data, and pitch data (depicted as 140 in Figure 6) as part of steps 908 and 910. The control unit 118 determines the head orientation of the users 108, 110 with respect to the ground surface, that is, the angle formed by the head of the rider 108 and the pillion rider 110 with respect to the X, Y and Z axes in 3D space. Thereafter, the method 900 moves to step 912.
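Purely by way of illustration, the fusion of IMU data into roll and pitch described above may be sketched with a simple complementary filter. The specification does not prescribe a particular sensor-fusion technique; the filter below, its blending coefficient and its sample inputs are assumptions for the sketch only.

```python
import math

def fuse_imu_sample(accel, gyro, prev_roll, prev_pitch, dt, alpha=0.98):
    """Estimate helmet roll and pitch (radians) from one IMU sample.

    accel: (ax, ay, az) in m/s^2; gyro: (gx, gy, gz) in rad/s.
    A complementary filter blends the gyro-integrated orientation
    (smooth but drifting) with the accelerometer-implied orientation
    (noisy but drift-free).
    """
    ax, ay, az = accel
    gx, gy, gz = gyro
    # Orientation implied by the gravity direction in the accelerometer data.
    accel_roll = math.atan2(ay, az)
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))
    # Blend gyro integral with accelerometer estimate.
    roll = alpha * (prev_roll + gx * dt) + (1 - alpha) * accel_roll
    pitch = alpha * (prev_pitch + gy * dt) + (1 - alpha) * accel_pitch
    return roll, pitch
```

With the helmet stationary and level (gravity entirely along Z, zero gyro rates), the sketch returns zero roll and pitch, consistent with an erect head.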
[046] At step 912, the control unit 118 determines the orientation of the head of the rider 108 with respect to the vehicle 102. For determining the head orientation of the rider 108 with respect to the vehicle 102, the control unit 118 determines the head orientation of the rider 108 with respect to the ground surface. Thereafter, the control unit 118 extrapolates the head orientation of the rider 108 with respect to the ground surface along with the orientation of the vehicle 102 determined by the vehicle tilt-sensor 112, for determining the head orientation of the rider 108 with respect to the vehicle 102. As an example, considering that the vehicle 102 is parked upright on a center stand (not shown) and the head of the rider 108 is tilted by 5 degrees in a left direction, the control unit 118 determines the head orientation of the rider 108 as 5 degrees with respect to the ground surface. Also, as the vehicle 102 is in a standstill condition and parked on the center stand, the orientation of the vehicle 102 is determined as zero degrees by the vehicle tilt-sensor 112. As such, upon fusing the data, the control unit 118 determines the head orientation of the rider 108 with respect to the vehicle 102 as 5 degrees. At step 914, the control unit 118 determines the head centre of mass Chr of the rider 108. Thereafter, the method 900 moves to step 916.
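As a purely illustrative sketch of the step above, the head orientation with respect to the vehicle may be obtained by subtracting the vehicle tilt from the head orientation with respect to the ground, both measured as signed angles from the vertical. The sign convention (positive = leaning left) is an assumption for the sketch.

```python
def head_angle_wrt_vehicle(head_angle_ground_deg, vehicle_tilt_deg):
    """Head lean relative to the vehicle's vertical plane (degrees).

    Both angles are signed, measured from the vertical, with
    positive values taken (hypothetically) as a lean to the left.
    """
    return head_angle_ground_deg - vehicle_tilt_deg
```

This reproduces the worked example: with the vehicle upright on its center stand (tilt 0 degrees) and the head tilted 5 degrees left, the relative orientation is 5 degrees; if the vehicle itself leaned 5 degrees left, the relative orientation would be zero.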
[047] At step 916, the control unit 118 acquires the load or weight exerted by the rider 108 on the side of inclination through the force sensors 114a. Considering the previous example, the rider 108 being inclined on the left side of the vehicle 102, the weight of the foot of the rider 108 exerts a greater force on the force sensor 114a provided on the left foot peg than on the right foot peg. Upon receiving the data from the force sensors 114a, the control unit 118 filters noise in the data generated by the force sensors 114a at step 918. The data from the force sensors 114a provides an indication of the posture or the stance or position of the rider 108 on the vehicle 102. In an embodiment, based on the interrelation between the torso orientation and the head orientation of the rider 108, the control unit 118 may determine the orientation of the torso of the rider 108. For example, if the head orientation of the rider 108 is determined to be tilted by 20 degrees from the erect position, the control unit 118 can determine the torso orientation to be 24 degrees from the erect position. Further, from the data from the user-data sensors 114 and the orientation of the torso, the control unit 118 determines the orientation of the legs and hands of the rider 108 (i.e. the position-related data). Further, the control unit 118 is aware of the contribution of the head, the torso, the legs, and the hands to the weight or mass of the rider 108. Thus, the control unit 118 determines the posture data from the orientations of the torso, the legs, the hands, and the head and the weightages associated with each of these orientations of the rider 108 using the posture determining model. The method 900 then moves to step 920.
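Purely for illustration, the weighted combination of segment orientations described above may be sketched as a planar segment model. The segment mass fractions and segment heights below are hypothetical values, not values recited in this specification; the specification leaves the exact weightages to the posture determining model.

```python
import math

# Hypothetical segment mass fractions (share of total body mass).
SEGMENT_MASS_FRACTION = {"head": 0.08, "torso": 0.50, "legs": 0.32, "hands": 0.10}
# Hypothetical segment centre-of-mass distances from the seat (metres) when erect.
SEGMENT_LENGTH = {"head": 0.75, "torso": 0.40, "legs": 0.15, "hands": 0.35}

def rider_com(segment_lean_deg):
    """Lateral offset and height of the rider's CoM relative to the seat.

    segment_lean_deg maps each segment name to its lean angle from
    the erect position (degrees, positive = leaning left).
    """
    x = y = 0.0
    for seg, frac in SEGMENT_MASS_FRACTION.items():
        theta = math.radians(segment_lean_deg[seg])
        r = SEGMENT_LENGTH[seg]
        x += frac * r * math.sin(theta)  # lateral offset contribution
        y += frac * r * math.cos(theta)  # height contribution
    return x, y  # metres, in a seat-fixed frame
```

With all segments erect the lateral offset is zero; a leftward lean of the head and torso (as in the 20-degree/24-degree example above) shifts the centre of mass to the left.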
[048] At step 920, the control unit 118 upon acquiring the posture data of the rider 108, determines the posture of the rider 108 with respect to the vehicle 102. As an example, the control unit 118 based on the posture data determines the rider 108 leaning towards the right or the rider 108 leaning towards left or the rider 108 bent towards front, etc. Thereafter, the method 900 moves to step 922.
[049] At step 922, the control unit 118 determines the centre of mass Cpr (also shown in Figures 6-8) of the rider 108 with respect to the vehicle 102, based on the determined posture data at step 920.
[050] Referring to Figure 10, a flow diagram depicting a method 1000 for determining the centre of mass Cpp (also shown in Figures 6-8) of the pillion rider 110 is provided.
[051] At step 1002, the control unit 118 is adapted to set an initial centre of mass Cpp of the pillion rider 110. This step initializes the system 100, so that real-time determination of the centre of mass Cpp is enabled. The method 1000 then moves to step 1004.
[052] At step 1004, the control unit 118 determines or acquires the load exerted on the seat of the vehicle 102 by the pillion rider 110 through the load sensor 114b. In an embodiment, the load sensor 114b may determine the load acting on the seat to be the weight of the pillion rider 110 in the absence of the rear mounted load 120. As an example, the load sensor 114b may determine the load or weight of the pillion rider 110 as 75 kg. The method 1000 then moves to step 1006.
[053] At step 1006, the control unit 118 acquires the head movement data (as shown in Figures 5a and 5b) of the pillion rider 110 from the first sensor 104a disposed in the helmet 106 worn by the pillion rider 110. The head movement data provides data pertaining to the tilt or orientation of the pillion rider 110 seated on the vehicle 102. Also, the control unit 118 acquires the data pertaining to the orientation of the vehicle 102 from the vehicle tilt-sensor 112. Upon acquiring the head movement data of the pillion rider 110, the method 1000 moves to step 1008.
[054] At step 1008, the control unit 118 preprocesses the head movement data of the pillion rider 110. In an embodiment, the control unit 118 preprocesses the head movement data by filtering the data by conventional filtering techniques known in the art, at the required sampling rate. In an embodiment, the control unit 118 preprocesses or filters (as shown in Figure 6) the head movement data along with the data pertaining to operating parameters of the vehicle 102. Upon filtering, the control unit 118 at step 1010 fuses the data received from the first sensor along with the operating parameters of the vehicle 102. In an embodiment, the control unit 118 fuses the data based on the sensor-fusion technique (as already described in Figure 9). Thereafter, the method 1000 moves to step 1012.
[055] At step 1012, the control unit 118 determines the orientation of the head of the pillion rider 110 with respect to the vehicle 102. For determining the head orientation of the pillion rider 110 with respect to the vehicle 102, the control unit 118 determines the head orientation of the pillion rider 110 with respect to the ground surface. Thereafter, the control unit 118 extrapolates the head orientation of the pillion rider 110 with respect to the ground surface along with the orientation of the vehicle 102 determined by the vehicle tilt-sensor 112, for determining the head orientation of the pillion rider 110 with respect to the vehicle 102. As an example, considering that the vehicle 102 is parked upright on a center stand (not shown) and the head of the pillion rider 110 is tilted by 3 degrees in a left direction, the control unit 118 determines the head orientation of the pillion rider 110 as 3 degrees with respect to the ground surface. Also, as the vehicle 102 is in the standstill condition and parked on the center stand, the orientation of the vehicle 102 is determined as zero degrees by the vehicle tilt-sensor 112. As such, upon fusing the data, the control unit 118 determines the head orientation of the pillion rider 110 with respect to the vehicle 102 as 3 degrees. At step 1014, the control unit 118 determines the head centre of mass Chp of the pillion rider 110. Thereafter, the method 1000 moves to step 1016.
[056] At step 1016, the control unit 118 acquires the load or weight exerted by the pillion rider 110 on the side of inclination through the force sensors 114a. Considering the previous example, the pillion rider 110 being inclined on the left side of the vehicle 102, the weight of the foot of the pillion rider 110 exerts a greater force on the force sensor 114a provided on the left foot peg than on the right foot peg. Upon receiving the data from the force sensors 114a, the control unit 118 filters (as shown in Figure 6) noise in the data generated by the force sensors 114a. In an embodiment, based on the interrelation between the torso orientation and the head orientation of the pillion rider 110, the control unit 118 may determine the orientation of the torso of the pillion rider 110. For example, if the head orientation of the pillion rider 110 is determined to be tilted by 20 degrees from the erect position, the control unit 118 can determine the torso orientation to be 24 degrees from the erect position. Further, from the data from the user-data sensors 114 and the orientation of the torso, the control unit 118 determines the orientation of the legs and hands of the pillion rider 110 (i.e. the position-related data). Further, the control unit 118 is aware of the contribution of the head, the torso, the legs, and the hands to the weight or mass of the pillion rider 110. Thus, the control unit 118 determines the posture data from the orientations of the torso, the legs, the hands, and the head and the weightages associated with each of these orientations of the pillion rider 110 using the posture determining model. The method 1000 then moves to step 1018.
[057] At step 1018, the control unit 118 determines the centre of mass Cpp (also shown in Figures 6-10) of the pillion rider 110 with respect to the vehicle 102, based on the determined posture data at step 1016.
[058] Referring to Figure 11 in conjunction with Figures 9 and 10, a flow diagram depicting a method 1100 for estimating the centre of mass C of the vehicle 102 with the one or more users 108, 110 is provided.
[059] At step 1102, the centre of mass Cv (as shown in Figures 7-10) of the vehicle 102 is determined. In an embodiment, the centre of mass of the vehicle 102 may be determined based on the data received from the vehicle tilt-sensor 112. In another embodiment, apart from the data received from the vehicle tilt-sensor 112, the control unit 118 may also consider, from the load sensor 114b, load data pertaining to the rear mounted load 120 that may be provided on the vehicle 102. The method 1100 then moves to step 1104.
[060] At step 1104, the control unit 118 sums or adds the centre of mass Cpr of the rider 108 with respect to the vehicle 102 determined at step 922, the centre of mass Cpp of the pillion rider 110 with respect to the vehicle 102 determined at step 1018 and the vehicle centre of mass Cv determined at step 1102 for determining the centre of mass C of the vehicle 102 with the one or more users 108, 110. For example, the control unit 118 may determine the centre of mass C of the vehicle 102 with the one or more users 108, 110 to be 63.3 cm and positioned below the centre of mass of the rider 108.
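Purely as an illustrative sketch of the combination step above: physically, the individual centres of mass are combined as a mass-weighted average over all components expressed in the same vehicle-fixed frame. The masses and coordinates in the example below are hypothetical and not recited in this specification.

```python
def combined_com(components):
    """Mass-weighted average of component centres of mass.

    components: list of (mass_kg, (x, y, z)) tuples -- e.g. the
    vehicle, the rider, the pillion rider and any rear mounted load,
    all expressed in the same vehicle-fixed coordinate frame (metres).
    """
    total_mass = sum(m for m, _ in components)
    return tuple(
        sum(m * c[i] for m, c in components) / total_mass
        for i in range(3)
    )
```

For a hypothetical 110 kg vehicle with its CoM at 0.45 m height and a 75 kg rider with CoM at 0.95 m, the combined CoM height is (110*0.45 + 75*0.95)/185, about 0.65 m, which lies below the rider's own centre of mass, consistent with the example in the paragraph above.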
[061] Figure 12 is a flow diagram depicting a method 1200 for estimating the centre of mass C of the vehicle 102 with the one or more users 108, 110. In an embodiment, the method 1200 estimates the centre of mass C based on the seating of the rider 108 and/or the pillion rider 110 on the vehicle 102. The method 1200 also estimates the centre of mass C based on the rear mounted load 120 provided behind the rider 108. As such, the description pertaining to the method 1200 is generalized for the sake of brevity and accordingly, the load, the posture data and the head orientation data received by the control unit 118 are to be suitably construed within the ambit of the present invention.
[062] At step 1202, as already described in Figures 9 and 10, the control unit 118 receives the head movement data of each of the users 108, 110 from each of the first sensors 104a, 104b, the position-related data from each of the user-data sensors 114, the operating parameters of the vehicle from the vehicle-data sensors 116 and the orientation of the vehicle 102 from the vehicle-tilt sensor 112.
[063] At step 1204, the control unit 118 determines the head centre of mass Chr, Chp for the users 108, 110 based on the data received from each of the first sensors 104a, 104b.
[064] Thereafter, at step 1206 the control unit 118 determines the posture data of each user 108, 110 based on the position-related data received from the user-data sensors 114. At step 1208, the orientation of the vehicle 102 with respect to the vertical plane Y-Y’ is then determined based on the data received from the vehicle-tilt sensor 112. The centre of mass C of the vehicle 102 with the users is then determined at step 1210 (as already described in Figures 9 and 10) based on the head centre of mass Chr, Chp of each user 108, 110, the posture data of each user 108, 110, the operating parameters of the vehicle 102 and the orientation of the vehicle 102 through the posture determining model. In an embodiment, as described in Figure 11, the centre of mass Cv of the vehicle 102 may also be considered along with the head centre of mass Chr, Chp of each user 108, 110, the posture data of each user 108, 110, the operating parameters of the vehicle 102 and the orientation of the vehicle 102 to determine the centre of mass C of the vehicle 102 with the users.
[065] The claimed invention as disclosed above is not routine, conventional or well understood in the art, as the claimed aspects enable the following solutions to the existing problems in conventional technologies. Specifically, the claimed aspect of determining the centre of mass of the rider and/or the pillion rider and/or the rear mounted load ensures accuracy in determining the centre of mass of the vehicle. Particularly, determining the posture data of the rider and/or the pillion rider enhances the accuracy in estimating the centre of mass of the vehicle. Further, monitoring the operating parameters and vehicle operating regions along with the estimation of the centre of mass of the vehicle enables the control unit to control the operation of the vehicle suitably. Consequently, the riding experience for the rider and/or the pillion rider is enhanced. Moreover, the monitoring and controlling of the operating parameters along with the estimation of the centre of mass of the vehicle enhances safety. Additionally, the present invention is capable of enabling estimation of the centre of mass of the vehicle with both the rider and the pillion rider. Also, in the present invention, not only can the posture and the centre of mass of each of the rider and the pillion rider be determined individually, but the combined system centre of mass can also be determined. Furthermore, a single first sensor is employed for each user, i.e. the rider or the pillion rider, for determining their respective centre of mass. Thus, the need for multiple sensors for determining the centre of mass is mitigated, rendering the system of the present invention inexpensive and less cumbersome.
Reference numerals
100 - System for estimating centre of mass of two-wheeled vehicle with one or more users
102 - Two-wheeled vehicle
104a, 104b - First sensor
106 - Helmet
108 - Rider
110 - Pillion rider
112 - Vehicle-tilt sensor
114 - User-data sensors
114a - Force sensors
114b - Load sensors
116 - Vehicle-data sensors
116a - Throttle position sensor
116b - Speed sensor
118 - Control unit
120 - Rear mounted load
122 - Input/Output module
124 - Memory unit
126 - Processing module
128 - Analytic module
130 - Power supply
132 - X co-ordinate values
134 - Y co-ordinate values
136 - Z co-ordinate values
138 - Roll data
140 - Pitch data
Chr - Head centre of mass of rider
Chp - Head centre of mass of pillion rider
Y-Y’ - Vertical plane of two-wheeled vehicle
C - Centre of mass of two-wheeled vehicle with one or more users
Cpr - Centre of mass of rider
Cpp - Centre of mass of pillion rider
WE CLAIM:
1. A system (100) for estimating centre of mass (C) of a two-wheeled vehicle (102) with one or more users (108, 110), the system (100) comprising:
at least one first sensor (104a, 104b) disposed in a helmet (106) worn by each of the one or more users (108, 110) of the two-wheeled vehicle (102), the one or more users (108, 110) comprising at least one of a rider (108) and a pillion-rider (110) of the two-wheeled vehicle (102), wherein the at least one first sensor (104a, 104b) being adapted to generate head movement data of the each of the one or more users (108, 110);
a vehicle-tilt sensor (112) disposed in the two-wheeled vehicle (102), the vehicle-tilt sensor (112) being adapted to monitor orientation of the two-wheeled vehicle (102) with respect to a vertical plane (Y-Y’) of the two-wheeled vehicle (102);
one or more user-data sensors (114) disposed in the two-wheeled vehicle (102), the one or more user-data sensors (114) being adapted to procure a position-related data of the each of the one or more users (108, 110);
one or more vehicle-data sensors (116) disposed in the two-wheeled vehicle (102), the one or more vehicle-data sensors (116) being adapted to monitor operating parameters of the two-wheeled vehicle (102); and
a control unit (118) disposed in the two-wheeled vehicle (102) and communicably coupled with the at least one first sensor (104a, 104b), the one or more user-data sensors (114) and the one or more vehicle-data sensors (116), the control unit (118) being configured to:
receive, the head movement data of each of the one or more users (108, 110) from each of the at least one first sensor (104a, 104b), the position-related data from each of the one or more user-data sensors (114), the operating parameters of the two-wheeled vehicle (102) from the one or more vehicle-data sensors (116) and orientation of the two-wheeled vehicle (102) from the vehicle-tilt sensor (112);
determine, a head centre of mass (Chr, Chp) for each of the one or more users (108, 110) of the two-wheeled vehicle (102) based on the data received from each of the at least one first sensor (104a, 104b);
determine, posture data of each of the one or more users (108, 110) based on the position-related data received from the one or more user-data sensors (114);
determine, orientation of the two-wheeled vehicle (102) with respect to the vertical plane (Y-Y’) based on the data received from the vehicle-tilt sensor (112); and
estimate, the centre of mass (C) of the two-wheeled vehicle (102) with the one or more users (108, 110) based on the head centre of mass (Chr, Chp) of each of the one or more users (108, 110), the posture data of each of the one or more users (108, 110), and the orientation of the two-wheeled vehicle (102).
2. The system (100) as claimed in claim 1, wherein the one or more user-data sensors (114) comprise:
at least one force sensor (114a) disposed on each of a foot peg of the two-wheeled vehicle (102), the at least one force sensor (114a) being adapted to monitor force exerted by each of the one or more users (108, 110) on each of the foot peg of the two-wheeled vehicle (102); and
a load sensor (114b) disposed below a seat of the two-wheeled vehicle (102), the load sensor (114b) being adapted to monitor weight exerted on the seat of the two-wheeled vehicle (102).
3. The system (100) as claimed in claim 2, wherein the load sensor (114b) is adapted to monitor load exerted on the seat based on at least one of weight of each of the one or more users (108, 110) seated on the two-wheeled vehicle (102) and a rear mounted load (120) disposed on the two-wheeled vehicle (102).
4. The system (100) as claimed in claim 1, wherein the one or more vehicle-data sensors (116) comprise:
a throttle-position sensor (116a) adapted to monitor a rate of change of a throttle position in a throttle body of the two-wheeled vehicle (102); and
a speed sensor (116b) adapted to monitor speed of the two-wheeled vehicle (102).
5. The system (100) as claimed in claim 1, wherein the control unit (118) is adapted to preprocess the head movement data of each of the one or more users (108, 110), the position-related data of each of the one or more users (108, 110), the operating parameters of the two-wheeled vehicle (102) and the orientation of the two-wheeled vehicle (102) by sampling and prefiltering the received head movement data, the position-related data, the operating parameters, and the orientation of the two-wheeled vehicle (102) at a pre-defined sampling rate.
6. The system (100) as claimed in claim 5, wherein the control unit (118) upon filtering, is configured to fuse the head movement data of each of the one or more users (108, 110), received from each of the at least one first sensor (104a, 104b) through a sensor-fusion technique for determining orientation of the head of each of the one or more users (108, 110) with respect to a ground surface.
7. The system (100) as claimed in claim 6, wherein the control unit (118) is configured to determine orientation of the head of each of the one or more users (108, 110) with respect to the two-wheeled vehicle (102), based on the determined orientation of the head of each of the one or more users (108, 110) with respect to the ground surface and the orientation of the two-wheeled vehicle (102).
8. The system (100) as claimed in claim 7, wherein the control unit (118) is configured to determine the posture data of each of the one or more users (108, 110) through a posture determining model based on the head movement data of each of the one or more users (108, 110) and force exerted by each of the one or more users (108, 110) on each foot peg from one or more force sensors (114a) of the one or more user-data sensors (114).
9. The system (100) as claimed in claim 8, wherein the control unit (118) is configured to determine a centre of mass (Cpr, Cpp) of each of the one or more users (108, 110) based on the posture data of each of the one or more users (108, 110) and load data received from a load sensor (114b) of the one or more user-data sensors (114).
10. The system (100) as claimed in claim 9, wherein the control unit (118) is configured to determine the centre of mass (C) of the two-wheeled vehicle (102) based on the centre of mass (Cpr, Cpp) of each of the one or more users (108, 110) and the orientation of the two-wheeled vehicle (102) determined by the vehicle-tilt sensor (112).
11. The system (100) as claimed in claim 1, wherein the control unit (118) is configured to control the operating parameters of the two-wheeled vehicle (102) based on the estimated centre of mass (C) of the two-wheeled vehicle (102).
12. A method (1200) for estimating a centre of mass (C) of a two-wheeled vehicle (102) with one or more users (108, 110), the method comprising:
receiving (1202), by a control unit (118), head movement data of each of the one or more users (108, 110) from each of at least one first sensor (104a, 104b), a position-related data from each of one or more user-data sensors (114), operating parameters of the two-wheeled vehicle (102) from one or more vehicle-data sensors (116) and orientation of the two-wheeled vehicle (102) from a vehicle-tilt sensor (112);
determining (1204), by the control unit (118), a head centre of mass (Chr, Chp) for each of the one or more users (108, 110) based on the data received from each of the at least one first sensor (104a, 104b);
determining (1206), by the control unit (118), a posture data of each of the one or more users (108, 110) based on the position-related data received from the one or more user-data sensors (114);
determining (1208), by the control unit (118), orientation of the two-wheeled vehicle (102) with respect to a vertical plane (Y-Y’) based on the data received from the vehicle-tilt sensor (112); and
estimating (1210), the centre of mass (C) of the two-wheeled vehicle (102) with the one or more users (108, 110) based on the head centre of mass (Chr, Chp) of each of the one or more users (108, 110), the posture data of each of the one or more users (108, 110), and the orientation of the two-wheeled vehicle (102).
13. The method (1200) as claimed in claim 12 comprising pre-processing, by the control unit (118), the head movement data of each of the one or more users (108, 110), the position-related data of each of the one or more users (108, 110), the operating parameters of the two-wheeled vehicle (102) by sampling and prefiltering the received head movement data, the position-related data, the operating parameters and the orientation of the two-wheeled vehicle (102) at a pre-defined sampling rate.
14. The method (1200) as claimed in claim 13 comprising fusing, by the control unit (118) the head movement data of each of the one or more users (108, 110), through a sensor-fusion technique for determining orientation of the head of each of the one or more users (108, 110) with respect to a ground surface.
15. The method (1200) as claimed in claim 14 comprising determining, by the control unit (118), orientation of the head of each of the one or more users (108, 110) with respect to the two-wheeled vehicle (102), based on the determined orientation of the head of each of the one or more users (108, 110) with respect to the ground surface and the orientation of the two-wheeled vehicle (102).
16. The method (1200) as claimed in claim 15 comprising determining, by the control unit (118) the posture data of each of the one or more users (108, 110) through a posture determining model based on the head movement data of each of the one or more users (108, 110) and force exerted by each of the one or more users (108, 110) on each foot peg from one or more force sensors (114a) of the one or more user-data sensors (114).
17. The method (1200) as claimed in claim 16 comprising determining, by the control unit (118), a centre of mass (Cpr, Cpp) of each of the one or more users (108, 110) based on the posture data of each of the one or more users (108, 110) and a load data received from a load sensor (114b) of the one or more user-data sensors (114).
18. The method (1200) as claimed in claim 17 comprising determining, by the control unit (118) the centre of mass (C) of the two-wheeled vehicle (102) with the one or more users (108, 110) based on the centre of mass (Cpr, Cpp) of each of the one or more users (108, 110) and the orientation of the two-wheeled vehicle (102) determined by the vehicle-tilt sensor (112).
19. The method (1200) as claimed in claim 12 comprising controlling, by the control unit (118) the operating parameters of the two-wheeled vehicle (102) with the one or more users (108, 110) based on the estimated centre of mass (C) of the two-wheeled vehicle (102).
Dated this 22nd day of November 2022
TVS MOTOR COMPANY LIMITED
By their Agent & Attorney
(Nikhil Ranjan)
of Khaitan & Co
Reg No IN/PA-1471
| # | Name | Date |
|---|---|---|
| 1 | 202241067105-STATEMENT OF UNDERTAKING (FORM 3) [22-11-2022(online)].pdf | 2022-11-22 |
| 2 | 202241067105-REQUEST FOR EXAMINATION (FORM-18) [22-11-2022(online)].pdf | 2022-11-22 |
| 3 | 202241067105-PROOF OF RIGHT [22-11-2022(online)].pdf | 2022-11-22 |
| 4 | 202241067105-POWER OF AUTHORITY [22-11-2022(online)].pdf | 2022-11-22 |
| 5 | 202241067105-FORM 18 [22-11-2022(online)].pdf | 2022-11-22 |
| 6 | 202241067105-FORM 1 [22-11-2022(online)].pdf | 2022-11-22 |
| 7 | 202241067105-FIGURE OF ABSTRACT [22-11-2022(online)].pdf | 2022-11-22 |
| 8 | 202241067105-DRAWINGS [22-11-2022(online)].pdf | 2022-11-22 |
| 9 | 202241067105-DECLARATION OF INVENTORSHIP (FORM 5) [22-11-2022(online)].pdf | 2022-11-22 |
| 10 | 202241067105-COMPLETE SPECIFICATION [22-11-2022(online)].pdf | 2022-11-22 |