
System And Method For Providing Riding Assistance In A Saddle Type Vehicle

Abstract: The present invention provides a system (100) and a method (400) for providing riding assistance in a saddle type vehicle (102). The system (100) comprises a control unit (106). The control unit (106) is configured to generate a local map pertaining to one or more objects around the saddle type vehicle (102). The control unit (106) is configured to determine a parking space based on the local map. The control unit (106) is configured to operate one or more vehicle parameters to autonomously traverse the saddle type vehicle (102) to the determined parking space, thereby providing riding assistance for parking the saddle type vehicle (102). Reference Figure 2


Patent Information

Application #
202341059717
Filing Date
05 September 2023
Publication Number
10/2025
Publication Type
INA
Invention Field
MECHANICAL ENGINEERING

Applicants

TVS MOTOR COMPANY LIMITED
“Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India

Inventors

1. THIRUNAVUKKARASU SENTHIL
TVS Motor Company Limited “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
2. RAHUL KAUSHIK
TVS Motor Company Limited “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
3. CHAITANYA RAJENDRA ZANPURE
TVS Motor Company Limited “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India
4. DATTA RAJARAM SAGARE
TVS Motor Company Limited, “Chaitanya”, No.12 Khader Nawaz Khan Road, Nungambakkam, Chennai 600 006, Tamil Nadu, India

Specification

Description:

FIELD OF THE INVENTION
[001] The present invention relates to a system and a method for providing riding assistance in a saddle type vehicle, and more specifically to riding assistance for automated parking and reverse parking in a saddle type vehicle.

BACKGROUND OF THE INVENTION
[002] In vehicles such as saddle type vehicles, a rider benefits when riding assistance is provided by the saddle type vehicle. With such assistance, the rider can comprehend the status of various components of the saddle type vehicle as well as the surrounding environment. Typically, the rider is skilled in riding the saddle type vehicle in a forward direction only. Accordingly, the rider is skilled in parking the saddle type vehicle in the forward direction.
[003] However, riding the saddle type vehicle in a reverse direction is a challenge for the rider, as reverse riding requires the rider to turn his head or body to observe the surroundings behind the saddle type vehicle, to prevent collision with other objects and/or people present behind the saddle type vehicle. More specifically, parking the saddle type vehicle while reverse riding is difficult and may lead to collisions, which is undesirable. Additionally, while reverse riding, the rider may miss blind spots, and the resulting collisions can cause injury to the rider and damage to the saddle type vehicle. Moreover, the rider is required to turn his head or body during reverse parking of the saddle type vehicle, which is non-ergonomic and makes the rider uncomfortable, deteriorating the riding experience in the saddle type vehicle. In addition, the rider is required to spend significant time in parking the saddle type vehicle.
[004] Thus, there is a need for a system and a method for providing riding assistance in the saddle type vehicle, which addresses at least one or more aforementioned problems.

SUMMARY OF THE INVENTION
[005] In one aspect of the invention, a system for providing riding assistance in a saddle type vehicle is disclosed. The system comprises a control unit. The control unit is configured to generate a local map pertaining to one or more objects around the saddle type vehicle. The control unit is configured to determine a parking space based on the local map. The control unit is configured to operate one or more vehicle parameters to autonomously traverse the saddle type vehicle to the determined parking space, thereby providing riding assistance for parking the saddle type vehicle.
[006] In an embodiment, the system comprises a sensing unit mounted on the saddle type vehicle. The sensing unit is communicatively coupled to the control unit and configured to procure information pertaining to surroundings of the saddle type vehicle. The sensing unit comprises a radio detection and ranging (RADAR) unit, one or more image sensors, a light detection and ranging (LIDAR) unit, one or more ultrasonic sensors, and one or more proximity sensors. The RADAR unit is configured to generate radar information pertaining to the surroundings of the saddle type vehicle. The one or more image sensors are configured to generate image information pertaining to the surroundings of the saddle type vehicle. The LIDAR unit is configured to generate surface information pertaining to the surroundings of the saddle type vehicle. The one or more ultrasonic sensors are configured to generate distance information pertaining to a distance between the saddle type vehicle and the surroundings. The one or more proximity sensors are configured to generate obstacle distance information pertaining to a proximity of the saddle type vehicle from obstacles in the surroundings.
[007] In an embodiment, the control unit is configured to receive the information pertaining to the surroundings of the saddle type vehicle from the sensing unit. The control unit is configured to determine the one or more objects based on the information pertaining to the surroundings of the saddle type vehicle.
[008] In an embodiment, the system comprises a steering angle sensor and a roll angle sensor disposed in the saddle type vehicle. The steering angle sensor is configured to procure steering angle data of the saddle type vehicle. The roll angle sensor is configured to procure roll angle data of the saddle type vehicle.
[009] In an embodiment, the control unit is configured to receive at least one of the steering angle data and the roll angle data from the steering angle sensor and the roll angle sensor. The control unit is configured to estimate a vehicle orientation based on at least one of the steering angle data and the roll angle data. The control unit is configured to generate a traversal path based on the estimated vehicle orientation and dimensions of the saddle type vehicle. The traversal path is adapted to route the saddle type vehicle towards the determined parking space for parking the saddle type vehicle.
[010] In an embodiment, the control unit is communicatively coupled to an engine management system (EMS). The EMS is adapted to procure the one or more vehicle parameters of the saddle type vehicle. The one or more vehicle parameters comprise a speed of the saddle type vehicle, a throttle opening position of a throttle body of the saddle type vehicle, an engine speed, a charge level of a battery in the saddle type vehicle, an ignition-OFF condition of the saddle type vehicle, and an electric start status of the saddle type vehicle.
[011] In an embodiment, the control unit is configured to receive the information pertaining to the surroundings of the saddle type vehicle, the one or more objects, the estimated vehicle orientation, and the dimensions of the saddle type vehicle. The control unit is configured to segment the local map into at least one potential parking space for the saddle type vehicle. The control unit is configured to determine a score for each of the at least one potential parking space. The control unit is configured to select one of the at least one potential parking space as the parking space for the saddle type vehicle based on the determined score.
[012] In an embodiment, the control unit is adapted to classify the detected objects into specific classes, which may be based on dimensions, shape, and the like. For each detected object, the control unit may produce a set of probabilities indicating the likelihood of the object belonging to each of the predefined classes. Thereafter, the control unit determines a confidence score, which is calculated based on both the classification probabilities and the quality of the determined one or more objects. Subsequently, the control unit executes a non-maximum suppression technique to filter out duplicate detections and retain the highest-scoring parking space. Such a determination helps avoid detection of multiple spaces around the same object and ensures that the final output contains the most confident and accurate detections. Thereafter, a threshold is applied to the confidence scores to determine which parking spaces are considered valid. Detections with scores above the threshold are retained as potential parking space detections, while those below the threshold are discarded. The control unit chooses the space with the highest score from the available potential parking spaces as the parking space for the saddle type vehicle.
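By way of illustration only, the following is a minimal Python sketch of the thresholding and non-maximum suppression step described above, assuming axis-aligned bounding boxes for the candidate parking spaces; the function names and threshold values are illustrative and are not taken from the specification.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def select_parking_space(boxes, scores, score_threshold=0.5, iou_threshold=0.5):
    """Discard low-confidence detections, suppress duplicates, return the best."""
    order = np.argsort(scores)[::-1]          # highest confidence first
    kept = []
    for i in order:
        if scores[i] < score_threshold:       # scores are sorted, so stop here
            break
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in kept):
            kept.append(i)                    # not a duplicate of a kept box
    return kept[0] if kept else None          # index of highest-scoring space
```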
[013] In an embodiment, the system comprises an instrument cluster disposed in the saddle type vehicle. The instrument cluster is communicatively coupled to the control unit. The instrument cluster is configured to alert a user of the saddle type vehicle upon detecting the parking space for the saddle type vehicle. The instrument cluster is configured to display the one or more objects, the local map, and the traversal path to the user of the saddle type vehicle.
[014] In an embodiment, the system comprises a parking switch disposed in the saddle type vehicle. The system is adapted to provide riding assistance for parking the saddle type vehicle when the parking switch is operated to an ON condition.
[015] In an embodiment, the system is adapted to provide riding assistance for parking the saddle type vehicle during a reverse parking condition of the saddle type vehicle.
[016] In an embodiment, the system comprises a database communicatively coupled to the control unit. The database is configured to procure information pertaining to surroundings of the saddle type vehicle.
[017] In another aspect of the invention, a method for providing riding assistance in a saddle type vehicle is disclosed. The method comprises generating, by a control unit, a local map pertaining to one or more objects around the saddle type vehicle. The method comprises determining, by the control unit, a parking space based on the local map. The method comprises operating, by the control unit, one or more vehicle parameters to autonomously traverse the saddle type vehicle to the determined parking space, thereby providing riding assistance for parking the saddle type vehicle.

BRIEF DESCRIPTION OF THE DRAWINGS
[018] Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figures. These figures are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that this is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 is a top perspective view of a vehicle, in accordance with an embodiment of the present invention.
Figure 2 is a block diagram illustrating a system for providing riding assistance in the vehicle, in accordance with an embodiment of the present invention.
Figures 3A-3D are a flow chart illustrating a method for providing riding assistance in the vehicle, in accordance with an embodiment of the present invention.
Figure 4 is a flow chart illustrating a method for providing riding assistance in the vehicle, in accordance with another embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION
[019] The present invention relates to a system and a method for providing riding assistance in a vehicle. The system in the present invention is configured to determine a parking space for the vehicle and autonomously traverse the vehicle to the determined parking space, thereby providing riding assistance to a user of the vehicle. In the present embodiment, the vehicle is a saddle-type vehicle.
[020] Figure 1 is a perspective view of a vehicle 102 in accordance with an embodiment of the present invention. The vehicle 102 comprises a system 100 for providing riding assistance to the user of the vehicle 102. In an embodiment, the term “riding assistance” refers to assistance provided by the system 100 to the user for parking the vehicle. In the present embodiment, the vehicle 102 is a saddle-type vehicle or a two-wheeled vehicle.
[021] Figure 2 is a block diagram illustrating the system 100 for providing riding assistance in the vehicle 102, in accordance with an embodiment of the present invention. Referring to Figure 2 in conjunction with Figure 1, the system 100 comprises a sensing unit 104 mounted on the vehicle 102. The sensing unit 104 is configured to procure information pertaining to surroundings of the vehicle 102. In one embodiment, the sensing unit 104 views rearwardly from the vehicle 102. In another embodiment, the sensing unit 104 views in at least one of forward, rearward, left, and right directions of the vehicle 102.
[022] The sensing unit 104 comprises at least one of a Radio Detection and Ranging (RADAR) unit 108, one or more image sensors 110, a Light Detection and Ranging (LIDAR) unit 112, one or more ultrasonic sensors 114, and one or more proximity sensors 116. The RADAR unit 108 is configured to procure radar information pertaining to the surroundings of the vehicle 102. The radar information corresponds to distance, angle, and radial velocity of one or more objects in the surroundings of the vehicle 102. The LIDAR unit 112 is configured to procure surface information pertaining to the surroundings of the vehicle 102. The surface information corresponds to a three-dimensional model of the surroundings of the vehicle 102. The one or more image sensors 110 are configured to procure image information pertaining to the surroundings of the vehicle 102. The image information corresponds to visual data of the surroundings of the vehicle 102.
[023] The one or more ultrasonic sensors 114 are configured to procure distance information pertaining to a distance between the vehicle 102 and the surroundings. The one or more proximity sensors 116 are configured to generate obstacle distance information pertaining to a proximity of the vehicle 102 from obstacles in the surroundings. In one example, the proximity sensors 116 may be one of infrared sensors, laser sensors, and the like.
[024] The information pertaining to the surroundings of the vehicle 102 comprises at least one of the radar information, the image information, the surface information, the distance information, and the obstacle distance information. In one embodiment, the information pertains to the surroundings behind the vehicle 102. In another embodiment, the RADAR unit 108, the one or more image sensors 110, the LIDAR unit 112, the one or more ultrasonic sensors 114, and the one or more proximity sensors 116 are mounted in at least one of forward, rearward, left, and right portions of the vehicle 102. Accordingly, the RADAR unit 108, the one or more image sensors 110, the LIDAR unit 112, the one or more ultrasonic sensors 114, and the one or more proximity sensors 116 procure information in at least one of forward, rearward, left, and right directions of the vehicle 102.
[025] The system 100 comprises a steering angle sensor 118 and a roll angle sensor 120. The steering angle sensor 118 and the roll angle sensor 120 are disposed in the vehicle 102. The steering angle sensor 118 is coupled to a steering unit (not shown) of the vehicle 102. In an embodiment, the steering unit may be a handlebar (not shown) of the vehicle 102. The steering angle sensor 118 is configured to procure steering angle data of the vehicle 102. The steering angle data is indicative of a direction and/or a rotation angle of the steering unit, with respect to a central axis (not shown) of the steering unit. In one embodiment, the steering angle sensor 118 may be one of a potentiometer, an inertial motion unit sensor, a gyroscope, and the like.
[026] The roll angle sensor 120 is configured to procure roll angle data of the vehicle 102 based on an inclination of the vehicle 102 with respect to a vertical axis Y-Y’. In one embodiment, the vertical axis Y-Y’ may be an axis along the top-down direction of the vehicle 102. As such, the roll angle sensor 120 is adapted to procure the inclination of the vehicle 102 on a left side or a right side about the vertical axis Y-Y’. The roll angle data is indicative of a roll or an inclination of the vehicle 102 about the vertical axis Y-Y’. In one embodiment, the roll angle sensor 120 may be one of an ultrasonic sensor, an inertial motion unit sensor, and the like.
[027] The system 100 comprises a control unit 106 that is coupled wirelessly or by wire with the sensing unit 104, the steering angle sensor 118 and the roll angle sensor 120. In one embodiment, the control unit 106 is coupled to the sensing unit 104, the steering angle sensor 118 and the roll angle sensor 120 using one of a universal serial bus, a Gigabit Multimedia Serial Link, wireless fidelity (Wi-Fi), Bluetooth, and the like.
[028] The control unit 106 is configured to receive at least one of the steering angle data and the roll angle data from the steering angle sensor 118 and the roll angle sensor 120 respectively. The control unit 106 is adapted to determine a steering angle of the steering unit based on the steering angle data procured from the steering angle sensor 118. The control unit 106 is adapted to determine the inclination or roll of the vehicle 102 based on the roll angle data procured from the roll angle sensor 120. Further, the control unit 106 is configured to estimate a vehicle orientation based on at least one of the steering angle data and the roll angle data. In one embodiment, the vehicle orientation may range from a left direction to a central direction to a right direction of the vehicle 102. In one example, when the vehicle orientation is in the left direction, the vehicle 102 rearwardly traverses in the left direction, and when the vehicle orientation is in the right direction, the vehicle 102 rearwardly traverses in the right direction. In another embodiment, the vehicle orientation corresponds to the inclination or roll of the vehicle 102 about the vertical axis Y-Y’ and/or the steering angle of the steering unit of the vehicle 102.
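As a hedged illustration of estimating the vehicle orientation from the steering angle data and the roll angle data, the sketch below propagates the heading with a kinematic bicycle model; the lean-angle correction and all parameter names are assumptions for illustration, since the specification does not prescribe a particular estimation formula.

```python
import math

def estimate_orientation(heading, steering_angle_deg, roll_angle_deg,
                         speed_mps, wheelbase_m, dt):
    """Propagate vehicle heading (radians) with a kinematic bicycle model.

    steering_angle_deg and roll_angle_deg stand in for the data from the
    steering angle sensor 118 and the roll angle sensor 120; the roll term
    is an illustrative correction for two-wheeler lean.
    """
    steer = math.radians(steering_angle_deg)
    roll = math.radians(roll_angle_deg)
    # Effective ground steering on a leaning two-wheeler (illustrative).
    effective_steer = math.atan(math.tan(steer) * math.cos(roll))
    heading_rate = speed_mps * math.tan(effective_steer) / wheelbase_m
    return heading + heading_rate * dt
```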
[029] The control unit 106 is configured to receive the information pertaining to the surroundings of the vehicle 102 from the sensing unit 104. The control unit 106 is configured to determine the one or more objects based on the information pertaining to the surroundings of the vehicle 102. The one or more objects include vehicles, one or more individuals, a wall, and the like. In an embodiment, the control unit 106 is adapted to execute one or more computer vision techniques for detecting the one or more objects in the surroundings of the vehicle 102. The control unit 106 associates the surface information corresponding to the three-dimensional model of the surroundings with attributes like color or reflectivity. As a result, a collection of three-dimensional points in the three-dimensional model representing surfaces and the one or more objects in the surroundings is determined by the control unit 106. The control unit 106 is configured to integrate the three-dimensional model pertaining to the surroundings with the distance information, the obstacle distance information, and the one or more objects. As a result, the control unit 106 is configured to generate a local map pertaining to the one or more objects around the vehicle 102. In an embodiment, the one or more objects may be still objects or moving objects. The control unit 106 is configured to update the local map based on real-time information pertaining to the one or more objects present in the surroundings of the vehicle 102.
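A minimal sketch of one possible local-map representation follows, assuming a two-dimensional occupancy grid centred on the vehicle and fused detections expressed in vehicle-relative coordinates; the 50-metre extent echoes the monitoring range mentioned in paragraph [052], while the resolution and labels are illustrative assumptions.

```python
import numpy as np

class LocalMap:
    """2-D occupancy grid around the vehicle, refreshed from fused detections."""

    def __init__(self, size_m=50.0, resolution_m=0.1):
        cells = int(size_m / resolution_m)
        self.resolution = resolution_m
        self.origin = size_m / 2.0            # vehicle sits at the grid centre
        self.grid = np.zeros((cells, cells), dtype=np.uint8)

    def mark_object(self, x_m, y_m, label=1):
        """Mark a detected object at (x, y) metres relative to the vehicle."""
        row = int((y_m + self.origin) / self.resolution)
        col = int((x_m + self.origin) / self.resolution)
        if 0 <= row < self.grid.shape[0] and 0 <= col < self.grid.shape[1]:
            self.grid[row, col] = label       # e.g. 1 = obstacle, 2 = person

    def update(self, detections):
        """Rebuild the map from the latest fused (x, y, label) detections."""
        self.grid[:] = 0
        for x_m, y_m, label in detections:
            self.mark_object(x_m, y_m, label)
```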
[030] The control unit 106 is configured to receive the information pertaining to the surroundings of the vehicle 102, the one or more objects, the estimated vehicle orientation, the local map, and dimensions of the vehicle 102. In an embodiment, the dimensions of the vehicle 102 correspond to at least one of length, width, height, maximum steering angle, and turning radius of the vehicle 102.
[031] The control unit 106 analyzes the information, the one or more objects, the estimated vehicle orientation, the local map, and the dimensions pertaining to the vehicle 102 to determine at least one potential parking space for the vehicle 102. The at least one potential parking space corresponds to space available in the surroundings suitable for parking the vehicle 102, which may be between the one or more objects in the local map. The control unit 106 is adapted to select a parking space from the at least one potential parking space for parking the vehicle 102. In order to choose the parking space for the vehicle 102, the control unit 106 is configured to segment the local map into the at least one potential parking space for the vehicle 102. For segmentation, the control unit 106 distinguishes between each of the at least one potential parking space using one or more machine learning models.
[032] In an embodiment, the term “potential parking space” refers to a probable or a prospective space where the vehicle 102 can be parked. In an embodiment, the term “parking space” refers to the space in the surroundings of the vehicle 102, where the vehicle 102 is designated or chosen to be parked.
[033] In an embodiment, the control unit 106 selects a space in the surroundings to be a potential parking space by determining dimensions of the space. If the dimensions of the space are capable of accommodating the vehicle 102, the control unit 106 selects the space as the at least one potential parking space. In other words, if the dimensions of the space are at least equal to those of the vehicle 102, the control unit 106 may select the space as the at least one potential parking space.
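The dimension check described above can be sketched as follows; the clearance margin is an illustrative assumption, since the specification only requires the space to be at least equal to the dimensions of the vehicle.

```python
def is_potential_parking_space(space_length_m, space_width_m,
                               vehicle_length_m, vehicle_width_m,
                               clearance_m=0.3):
    """A space qualifies if it can accommodate the vehicle footprint.

    clearance_m is an illustrative safety margin, not a value from the
    specification, which only requires the space to equal or exceed the
    vehicle dimensions.
    """
    return (space_length_m >= vehicle_length_m + clearance_m and
            space_width_m >= vehicle_width_m + clearance_m)
```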
[034] The one or more machine learning models depend on at least one of available datasets from the sensing unit 104, processing capability of the control unit 106, real-time requirements, characteristics of the at least one potential parking space and non-valid areas in the surroundings of the vehicle 102. The available datasets comprise images of at least one of a range of size of each of the at least one potential parking space, the vehicle orientation, lighting conditions in the surroundings, non-valid areas in the surroundings, and background of the surroundings to ensure that the machine learning models can identify the parking space in various situations. The images are annotated with at least one of bounding box coordinates and pixel-level segmentation masks that indicate location and dimensions of the parking spaces.
[035] In one embodiment, the one or more machine learning models comprise convolutional neural networks, mask region-based convolutional neural networks (Mask R-CNN), You Only Look Once (YOLO) model, EfficientDet, and RetinaNet. It shall be obvious to a person skilled in the art that various machine learning models can be implemented to distinguish between the at least one potential parking space and the non-parking spaces without moving away from the scope of the present invention. Thus, the one or more machine learning models are trained using the annotated datasets. The convolutional neural networks operate using image-based object recognition tasks. The convolutional neural networks learn hierarchical features from input images to detect patterns and distinguish between the parking spaces and the non-parking spaces.
[036] In an embodiment, the Mask R-CNN is an instance segmentation model that provides the pixel-level segmentation masks for the one or more objects. As a result, the Mask R-CNN is capable of segmenting and classifying parking spaces, thereby allowing more precise identification of boundaries of the parking spaces. In an embodiment, the YOLO model is an object detection model that provides real-time object detection in a single instance. The YOLO model detects and classifies parking spaces by providing the bounding box coordinates and class probabilities for each of the detected parking spaces.
[037] In an embodiment, the EfficientDet is an efficient and accurate object detection model that combines efficient network architectures with advanced feature fusion techniques. The EfficientDet provides accuracy and computational efficiency, thereby making it suitable for parking space detection tasks. In an embodiment, the RetinaNet is another object detection model that employs a feature pyramid network to detect the one or more objects at multiple scales. The RetinaNet identifies the parking spaces by predicting bounding box coordinates and class probabilities for each of the detected parking spaces. The one or more machine learning models are implemented on validation sets to determine performance of each machine learning model. The validation set corresponds to a set of known parking spaces that are provided to the one or more machine learning models to identify the parking space.
[038] In an embodiment, the control unit 106 distinguishes between the parking spaces and the non-parking spaces using an ensemble of multiple machine learning models to enhance accuracy and robustness of the detection of the parking spaces. The ensemble comprises at least one of the Mask R-CNN, the YOLO model, the EfficientDet, and the RetinaNet. The ensemble of machine learning models is saved in a file or a model checkpoint. In one example, format of the saved ensemble is one of TensorFlow SavedModel and TensorFlow checkpoint. The saved ensemble allows for easy loading and deployment in the system 100.
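As a minimal sketch of persisting a model in the TensorFlow SavedModel format named above, the snippet below saves and reloads a stand-in tf.Module; the dummy detector is purely illustrative and is not the claimed ensemble.

```python
import tensorflow as tf

class ParkingDetector(tf.Module):
    """Illustrative stand-in for one trained ensemble member."""

    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([4, 1]), name="weights")

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def score(self, boxes):
        # Dummy confidence per candidate bounding box.
        return tf.matmul(boxes, self.w)

# Persist in the TensorFlow SavedModel format, then reload for deployment.
tf.saved_model.save(ParkingDetector(), "parking_space_detector/")
detector = tf.saved_model.load("parking_space_detector/")
```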
[039] In order to do so, the control unit 106 is configured to determine a score for each of the at least one potential parking space determined by the ensemble of the machine learning models. The control unit 106 applies fusion mechanisms, such as weighted averaging, late fusion, and the like, to the at least one potential parking space classified by each of the one or more machine learning models. In an embodiment, higher weights may be assigned to the potential parking space of a model that performs better on the validation sets. With regard to the bounding box predictions of the machine learning models, the control unit 106 averages the coordinates or uses median values. With regard to the pixel-level segmentation masks, the control unit 106 merges the masks or selects the most efficient mask. As a result, the score is determined based on a confidence threshold or majority votes among the one or more machine learning models.
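A minimal sketch of the weighted late fusion described above, assuming each model reports a confidence and a bounding box per candidate space; the weights would come from validation-set performance, and all names are illustrative.

```python
import numpy as np

def fuse_ensemble(model_scores, model_boxes, model_weights):
    """Weighted late fusion of per-model confidences and box coordinates.

    model_scores:  (n_models, n_spaces) confidences per candidate space.
    model_boxes:   (n_models, n_spaces, 4) bounding boxes per model.
    model_weights: per-model weights, e.g. from validation-set performance.
    """
    w = np.asarray(model_weights, dtype=float)
    w = w / w.sum()                                    # normalise the weights
    scores = np.asarray(model_scores, dtype=float)
    fused_scores = (w[:, None] * scores).sum(axis=0)   # weighted averaging
    fused_boxes = np.median(np.asarray(model_boxes), axis=0)  # median coords
    return fused_scores, fused_boxes

# The space with the highest fused score becomes the parking space:
#   best = int(np.argmax(fused_scores))
```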
[040] The control unit 106 is configured to select one of the at least one potential parking space as the parking space for the vehicle 102 based on the determined score. The potential parking space with the highest score is selected by the control unit 106. The control unit 106 is configured to generate a traversal path based on the estimated vehicle orientation and the dimensions of the vehicle 102. The traversal path is adapted to route the vehicle 102 towards the determined parking space for parking the vehicle 102.
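As one illustrative geometric primitive from which such a traversal path could be assembled, the sketch below samples a constant-curvature arc from the current pose, respecting a turning radius taken from the vehicle dimensions; it is an assumption for illustration, not the rule-based or learning-based generation described in the following paragraphs.

```python
import math

def arc_path(x0, y0, heading0, radius_m, arc_angle_rad, n_points=20):
    """Sample a constant-curvature arc from the current pose.

    radius_m should respect the vehicle's minimum turning radius (one of
    the vehicle dimensions used by the control unit). The signs of
    radius_m and arc_angle_rad together select the turn side and the
    direction of travel (forward or reverse).
    """
    cx = x0 - radius_m * math.sin(heading0)    # centre of the turning circle
    cy = y0 + radius_m * math.cos(heading0)
    path = []
    for k in range(n_points + 1):
        a = heading0 + arc_angle_rad * k / n_points
        path.append((cx + radius_m * math.sin(a),
                     cy - radius_m * math.cos(a)))
    return path
```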
[041] In an embodiment, the traversal path is generated by the control unit 106 using a rule-based path generation method. Training data corresponding to labelled examples of parking manoeuvres is utilized to generate the traversal path. The training data comprises initial conditions such as the vehicle orientation, the one or more objects, and the dimensions of the parking spaces. The training data also comprises traversal paths that successfully park the vehicle 102. The control unit 106 deploys algorithms such as decision trees, reinforcement learning, and neural networks using the training data, thereby generating the traversal path to the determined parking space.
[042] In an embodiment, the traversal path is generated by the control unit 106 using an imitation learning method. The imitation learning method comprises neural networks that learn from successful demonstrations of parking manoeuvres, which include initial conditions and successful traversal paths. The initial conditions comprise the vehicle orientation, the one or more objects, and the dimensions of the parking spaces. The neural networks imitate the parking manoeuvres by mapping the initial conditions to the successful traversal paths, thereby generating the traversal path to the determined parking space.
[043] The system 100 further comprises an instrument cluster 124 disposed in the vehicle 102. The instrument cluster 124 is communicatively coupled to the control unit 106. In one embodiment, the instrument cluster 124 and the control unit 106 are coupled via Wi-Fi. The instrument cluster 124 comprises one or more alerting devices (not shown) that are communicatively coupled to the control unit 106. The one or more alerting devices comprise a display unit (not shown) that is coupled wirelessly or by wire to the control unit 106. In one example, the one or more alerting devices comprise at least one of the display unit, an audible device, and a haptic device located on at least one of a rider seat, the handlebar, a foot pedal, and the like.
[044] The instrument cluster 124 is configured to alert the user of the vehicle 102 upon detecting the parking space for the vehicle 102. The instrument cluster 124 is configured to display at least one of the one or more objects, the local map, and the traversal path to the user of the vehicle 102. In an embodiment, the displayed one or more objects are labelled by the control unit 106. In one example, the labelled one or more objects correspond to humans, other vehicles, trees, walls, roads, and the like.
[045] The control unit 106 is communicatively coupled to an engine management system (EMS) 122 of the vehicle 102. The EMS 122 is disposed in the vehicle 102. The EMS 122 is communicatively coupled to various sensors and parts (not shown) of the vehicle 102. The EMS 122 is adapted to procure one or more vehicle parameters of the vehicle 102. The one or more vehicle parameters comprise a speed of the vehicle 102, a throttle opening position of a throttle body of the vehicle 102, an engine speed, a charge level of a battery in the vehicle 102, an ignition-OFF condition of the vehicle 102, and an electric start status of the vehicle 102.
[046] The control unit 106 is configured to operate the one or more vehicle parameters to autonomously traverse the vehicle 102 to the determined parking space, thereby providing riding assistance for parking the vehicle 102. The control unit 106 performs motorized steering that controls direction and angle of the vehicle 102 to follow the traversal path accurately. In an embodiment, the vehicle 102 comprises a balancing system (not shown) communicatively coupled with the control unit 106. The balancing system is coupled to at least one of a prime mover (not shown) of the vehicle (102), the steering unit, and the EMS 122. During autonomous traversal to the determined parking space, the control unit 106 operates the motorized steering and the balancing system along with the one or more vehicle parameters.
[047] In an embodiment, the control unit 106 monitors the vehicle orientation in real-time to autonomously traverse towards the determined parking space accurately. The control unit 106 adjusts the vehicle orientation when deviations from the traversal path are detected, to align the vehicle 102 as required. The vehicle 102 is capable of autonomously traversing in forward and reverse directions towards the determined parking space. As a result, the system 100 provides riding assistance during a forward parking condition and a reverse parking condition. In yet another embodiment, the system 100 corresponds to an advanced rider assistance system (ARAS) that provides riding assistance in the vehicle 102.
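A minimal sketch of the real-time deviation correction described above, assuming a proportional heading controller steering toward the next point of the traversal path; the gain and steering limit are illustrative tuning values, not taken from the specification.

```python
import math

def steering_correction(pose, target, gain=1.5, max_steer_deg=35.0):
    """Proportional heading correction toward the next path point.

    pose = (x, y, heading_rad); target = (x, y) on the traversal path.
    On the vehicle, the returned command would drive the motorized
    steering via the control unit; gain and max_steer_deg are assumed
    tuning values.
    """
    x, y, heading = pose
    desired = math.atan2(target[1] - y, target[0] - x)
    error = math.atan2(math.sin(desired - heading),
                       math.cos(desired - heading))   # wrap to [-pi, pi]
    steer_deg = math.degrees(gain * error)
    return max(-max_steer_deg, min(max_steer_deg, steer_deg))
```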
[048] The system 100 comprises a parking switch 126 disposed in the vehicle 102. In an embodiment, the system 100 is adapted to provide riding assistance for parking the vehicle 102 when the parking switch 126 is operated to an ON condition by the user. In another embodiment, the user can override the system 100 by operating the parking switch 126 to an OFF condition to manually park the vehicle 102. The parking switch 126 may be disposed in one of the instrument cluster, the display unit, and the vehicle 102.
[049] The system 100 further comprises a database 128. The database 128 is communicatively coupled to the control unit 106. The database 128 is configured to procure the information pertaining to the surroundings of the vehicle 102. In an embodiment, the database 128 is a server. In another embodiment, the system 100 provides riding assistance to autonomously traverse towards the determined parking space based on the information procured from the database 128. The database 128 may be communicably coupled to the control unit 106, for storing data processed, received or transmitted by the control unit 106 while determining the parking space for the vehicle 102.
[050] In an embodiment, the control unit 106 and/or the EMS 122 are embodied as a multi-core processor, a single core processor, or a combination of one or more multi-core processors and one or more single core processors. For example, the control unit 106 and the EMS 122 are embodied as one or more of various processing devices, such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing circuitry with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. In another embodiment, the control unit 106 and the EMS 122 are configured to execute hard-coded functionality.
[051] The control unit 106 and the EMS 122 comprise a storage unit (not shown). The storage unit of the control unit 106 and the EMS 122 may include a memory. The memory may be a main memory, a static memory, or a dynamic memory. The memory may include, but is not limited to, computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media, and the like. The memory is operable to store instructions executable by the processor. The functions, acts or tasks illustrated in the figures or described herein may be performed by the programmed processor executing the instructions stored in the memory.
[052] In an embodiment, the control unit 106 is adapted to monitor and procure data pertaining to the surroundings up to a predetermined distance from the vehicle 102. As an example, the control unit 106 is adapted to monitor and procure data pertaining to the surroundings up to a distance of 50 meters from the vehicle 102.
[053] Figures 3A-3D are a flow chart illustrating a method 300 for providing riding assistance in the vehicle 102, in accordance with an embodiment of the present invention. In order to perform the method 300, a system such as the system 100 as shown in Figure 2 may be used.
[054] At step 302, the sensing unit 104 procures information pertaining to surroundings of the vehicle 102. The information pertaining to the surroundings of the vehicle 102 comprises at least one of the radar information, the image information, the surface information, the distance information, and the obstacle distance information, as explained in the description of Figure 2.
[055] The steering angle sensor 118 procures the steering angle data of the vehicle 102 at step 304. The steering angle data is indicative of the direction and/or the rotation angle of the steering unit, with respect to the central axis (not shown) of the steering unit. The roll angle sensor 120 procures the roll angle data of the vehicle 102 at step 306. The roll angle data is indicative of a roll of the vehicle 102. The control unit 106 is coupled wirelessly or by wire with the sensing unit 104, the steering angle sensor 118 and the roll angle sensor 120. At step 308, the control unit 106 receives at least one of the steering angle data and the roll angle data from the steering angle sensor 118 and the roll angle sensor 120.
[056] At step 310, the control unit 106 estimates the vehicle orientation based on the steering angle data and the roll angle data. In one embodiment, the vehicle orientation may be the inclination of the vehicle 102 in the left or right direction about the vertical axis Y-Y’.
[057] At step 312, the control unit 106 receives information pertaining to the surroundings of the vehicle 102 from the sensing unit 104 mounted on the vehicle 102. The control unit 106 then determines the one or more objects based on the information pertaining to the rearward view of the vehicle 102 at step 314. The one or more objects include vehicles, people, walls, and the like. In an embodiment, the control unit 106 uses the computer vision techniques to detect the one or more objects in the surroundings of the vehicle 102, as described in the description pertaining to Figure 2.
[058] At step 316, the control unit 106 generates the local map pertaining to the one or more objects around the vehicle 102 as described in description of Figure 2. The control unit 106 is configured to continuously update the local map based on real-time information pertaining to the one or more objects in the surroundings of the vehicle 102. At step 318, the control unit 106 receives the information pertaining to the surroundings of the vehicle 102, the one or more objects, the estimated vehicle orientation, the local map, and the dimensions of the vehicle 102.
[059] The control unit 106 analyzes the information, the one or more objects, the estimated vehicle orientation, the local map, and the dimensions pertaining to the vehicle 102 to determine at least one potential parking space for the vehicle 102. At step 320, the control unit 106 segments the local map into the at least one potential parking space for the vehicle 102. For segmentation, the control unit 106 distinguishes between the at least one potential parking space using one or more machine learning models as described in description of Figure 2.
[060] At step 322, the control unit 106 determines the score for each of the at least one potential parking space. The score is determined based on the confidence threshold, the majority votes among the one or more machine learning models, or factors pertaining to each of the at least one potential parking space, such as lighting, background, and the like. At step 324, the control unit 106 selects one of the at least one potential parking space from the local map as the parking space for the vehicle 102 based on the determined score. The potential parking space with the highest score is selected by the control unit 106 as the parking space for the vehicle 102. Thus, the control unit 106 determines the parking space based on the local map at step 326.
[061] At step 328, the control unit 106 generates the traversal path based on the estimated vehicle orientation, the local map, and dimensions of the vehicle 102. The traversal path is adapted to route the vehicle 102 towards the determined parking space for parking the vehicle 102. In an embodiment, the traversal path is generated using a rule-based path generation method by the control unit 106. In another embodiment, the traversal path is generated using imitation learning method by the control unit 106.
[062] At step 330, the instrument cluster 124 alerts the user of the vehicle 102 upon detecting the parking space for the vehicle 102. At step 332, the instrument cluster 124 displays the one or more objects, the local map, and the traversal path to the user of the vehicle 102.
[063] The control unit 106 is communicatively coupled to the EMS 122 of the vehicle 102. The EMS 122 is communicatively coupled to the various sensors and parts of the vehicle 102. At step 334, the EMS 122 procures one or more vehicle parameters of the vehicle 102. At step 336, the control unit 106 operates the one or more vehicle parameters to autonomously traverse the vehicle 102 to the determined parking space, thereby providing riding assistance for parking the vehicle 102. During autonomous traversal to the determined parking space, the control unit 106 operates the steering unit and the balancing system along with the one or more vehicle parameters. In an embodiment, the steering unit is coupled to an electric motor (not shown), and thus the control unit 106 controls operation of the electric motor for controlling the steering angle of the vehicle 102 during autonomous parking. The vehicle 102 is capable of autonomously traversing in forward and reverse directions towards the determined parking space. As a result, the system 100 provides riding assistance during a forward parking condition and a reverse parking condition.
[064] In an embodiment, the control unit 106 proceeds to autonomously park the vehicle 102 in the parking space when the parking switch 126 of the vehicle 102 is in the ON condition. During the OFF condition, the control unit 106 may merely indicate the parking space to the rider, for manually parking the vehicle 102.
[065] Figure 4 is a flow chart illustrating a method 400 for providing riding assistance in the vehicle 102, in accordance with another embodiment of the present invention. At step 402, the control unit 106 generates the local map pertaining to one or more objects around the vehicle 102. The local map corresponds to the one or more objects in the surroundings, the distance information, and the obstacle distance information. At step 404, the control unit 106 determines the parking space based on the local map. The determination of the parking space is described in description of Figure 2. At step 406, the control unit 106 operates the one or more vehicle parameters to autonomously traverse the vehicle 102 to the determined parking space, thereby providing riding assistance for parking the vehicle 102.
[066] The claimed invention as disclosed above is not routine, conventional, or well understood in the art, as the claimed aspects enable the following solutions to the existing problems in conventional technologies. Specifically, the claimed aspects include determining a parking space based on the local map, which pertains to the one or more objects, the distance information, and the obstacle distance information around the vehicle. Additionally, the system operates one or more vehicle parameters to autonomously traverse the saddle type vehicle to the determined parking space in a forward parking condition as well as a reverse parking condition. The saddle type vehicle of the present invention ensures that no blind spots are missed by the user, based on the information pertaining to the surroundings. The present invention is capable of contextual detection as a result of determining the one or more objects from the information pertaining to the surroundings. As a result, safety of the vehicle and the user is increased due to automatic parking of the saddle type vehicle. Further, the user saves time otherwise spent in parking the saddle type vehicle.
[067] In light of the abovementioned advantages and the technical advancements provided by the disclosed system and method, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps enable the following solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself as the claimed steps provide a technical solution to a technical problem.
[068] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable storage medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[069] While the present invention has been described with respect to certain embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.

List of Reference Numerals
100 – System
102 – Vehicle
104 – Sensing unit
106 – Control unit
108 – RADAR unit
110 – Image sensors
112 – LIDAR unit
114 – Ultrasonic sensors
116 – Proximity sensors
118 – Steering angle sensor
120 – Roll angle sensor
122 – Engine Management System
124 – Instrument cluster
126 – Parking switch
128 – Database
Claims:

1. A system (100) for providing riding assistance in a saddle type vehicle (102), the system (100) comprising:
a control unit (106), the control unit (106) being configured to:
generate, a local map pertaining to one or more objects around the saddle type vehicle (102);
determine, a parking space based on the local map; and
operate, one or more vehicle parameters to autonomously traverse the saddle type vehicle (102) to the determined parking space, thereby providing riding assistance for parking the saddle type vehicle (102).

2. The system (100) as claimed in claim 1 comprising a sensing unit (104) being mounted on the saddle type vehicle (102) and being communicatively coupled to the control unit (106), the sensing unit (104) being configured to procure information pertaining to surroundings of the saddle type vehicle (102), wherein the sensing unit (104) comprises:
a radio detection and ranging (RADAR) unit (108), the RADAR unit (108) being configured to generate radar information pertaining to the surroundings of the saddle type vehicle (102);
one or more image sensors (110), the one or more image sensors (110) being configured to generate image information pertaining to the surroundings of the saddle type vehicle (102);
a light detection and ranging (LIDAR) unit (112), the LIDAR unit (112) being configured to generate surface information pertaining to the surroundings of the saddle type vehicle (102);
one or more ultrasonic sensors (114), the one or more ultrasonic sensors (114) being configured to generate distance information pertaining to a distance between the saddle type vehicle (102) and the surroundings; and
one or more proximity sensors (116), the one or more proximity sensors (116) being configured to generate obstacle distance information pertaining to a proximity of the saddle type vehicle (102) from obstacles in the surroundings.

3. The system (100) as claimed in claim 2, wherein the control unit (106) is configured to:
receive, the information pertaining to the surroundings of the saddle type vehicle (102) from the sensing unit (104); and
determine, the one or more objects based on the information pertaining to the surroundings of the saddle type vehicle (102).

4. The system (100) as claimed in claim 3 comprising:
a steering angle sensor (118) being disposed in the saddle type vehicle (102), the steering angle sensor (118) being configured to procure steering angle data of the saddle type vehicle (102); and
a roll angle sensor (120) being disposed in the saddle type vehicle (102), the roll angle sensor (120) being configured to procure roll angle data of the saddle type vehicle (102).

5. The system (100) as claimed in claim 4, wherein the control unit (106) is configured to:
receive, at least one of the steering angle data and the roll angle data from the steering angle sensor (118) and the roll angle sensor (120);
estimate, a vehicle orientation based on at least one of the steering angle data and the roll angle data; and
generate, a traversal path based on the estimated vehicle orientation, the local map, and dimensions of the saddle type vehicle (102), the traversal path being adapted to route the saddle type vehicle (102) towards the determined parking space for parking the saddle type vehicle (102).

6. The system (100) as claimed in claim 5, wherein the control unit (106) is communicatively coupled to an engine management system (EMS) (122), the EMS (122) being adapted to procure the one or more vehicle parameters of the saddle type vehicle (102), wherein the one or more vehicle parameters comprise a speed of the saddle type vehicle (102), a throttle opening position of a throttle body of the saddle type vehicle (102), an engine speed, a charge level of a battery in the saddle type vehicle (102), an ignition-OFF condition of the saddle type vehicle (102), and an electric start status of the saddle type vehicle (102).

7. The system (100) as claimed in claim 5, wherein the control unit (106) is configured to:
receive, the information pertaining to the surroundings of the saddle type vehicle (102), the one or more objects, the estimated vehicle orientation, the local map and the dimensions of the saddle type vehicle (102);
segment, the local map into at least one potential parking space for the saddle type vehicle (102);
determine, a score for each of the at least one potential parking space; and
select, one of the at least one potential parking space as the parking space for the saddle type vehicle (102) based on the determined score.

8. The system (100) as claimed in claim 5 comprising an instrument cluster (124), the instrument cluster (124) being disposed in the saddle type vehicle (102), the instrument cluster (124) being communicatively coupled to the control unit (106), the instrument cluster (124) being configured to:
alert, a user of the saddle type vehicle (102) upon detecting the parking space for the saddle type vehicle (102); and
display, the one or more objects, the local map, and the traversal path to the user of the saddle type vehicle (102).

9. The system (100) as claimed in claim 1 comprising a parking switch (126) disposed in the saddle type vehicle (102), the system (100) being adapted to provide riding assistance for parking the saddle type vehicle (102), when the parking switch (126) is operated to an ON condition.

10. The system (100) as claimed in claim 1 being adapted to provide riding assistance for parking the saddle type vehicle (102) during a reverse parking condition of the saddle type vehicle (102).

11. The system (100) as claimed in claim 1 comprising a database (128), the database (128) being communicatively coupled to the control unit (106), the database (128) being configured to procure information pertaining to surroundings of the saddle type vehicle (102).

12. A method (400) for providing riding assistance in a saddle type vehicle (102), the method (400) comprising:
generating (402), by a control unit (106), a local map pertaining to one or more objects around the saddle type vehicle (102);
determining (404), by the control unit (106), a parking space based on the local map; and
operating (406), by the control unit (106), one or more vehicle parameters to autonomously traverse the saddle type vehicle (102) to the determined parking space, thereby providing riding assistance for parking the saddle type vehicle (102).

13. The method (400) as claimed in claim 12 comprising procuring, by a sensing unit (104) being mounted on the saddle type vehicle (102) and communicatively coupled to the control unit (106), information pertaining to surroundings of the saddle type vehicle (102), wherein procuring the information comprises:
generating, by a radio detection and ranging (RADAR) unit (108), radar information pertaining to the surroundings of the saddle type vehicle (102);
generating, by one or more image sensors (110), image information pertaining to the surroundings of the saddle type vehicle (102);
generating, by a light detection and ranging (LIDAR) unit (112), surface information pertaining to the surroundings of the saddle type vehicle (102);
generating, by one or more ultrasonic sensors (114), distance information pertaining to a distance between the saddle type vehicle (102) and the surroundings; and
generating, by one or more proximity sensors (116), obstacle distance information pertaining to a proximity of the saddle type vehicle (102) from obstacles in the surroundings.

14. The method (400) as claimed in claim 13 comprising:
receiving, by the control unit (106), the information pertaining to the surroundings of the saddle type vehicle (102) from the sensing unit (104); and
determining, by the control unit (106), the one or more objects based on the information pertaining to the surroundings of the saddle type vehicle (102).

15. The method (400) as claimed in claim 14 comprising:
procuring, by a steering angle sensor (118) disposed in the saddle type vehicle (102), steering angle data of the saddle type vehicle (102); and
procuring, by a roll angle sensor (120) disposed in the saddle type vehicle (102), roll angle data of the saddle type vehicle (102).

16. The method (400) as claimed in claim 15 comprising:
receiving, by the control unit (106), at least one of the steering angle data and the roll angle data from the steering angle sensor (118) and the roll angle sensor (120);
estimating, by the control unit (106), a vehicle orientation based on at least one of the steering angle data and the roll angle data; and
generating, by the control unit (106), a traversal path based on the estimated vehicle orientation, the local map, and dimensions of the saddle type vehicle (102), the traversal path being adapted to route the saddle type vehicle (102) towards the determined parking space for parking the saddle type vehicle (102).

17. The method (400) as claimed in claim 16 comprising procuring, by an engine management system (EMS) (122) communicatively coupled to the control unit (106), the one or more vehicle parameters of the saddle type vehicle (102), wherein the one or more vehicle parameters comprise a speed of the saddle type vehicle (102), a throttle opening position of a throttle body of the saddle type vehicle (102), an engine speed, a charge level of a battery in the saddle type vehicle (102), an ignition-OFF condition of the saddle type vehicle (102), and an electric start status of the saddle type vehicle (102).

18. The method (400) as claimed in claim 16 comprising:
receiving, by the control unit (106), the information pertaining to the surroundings of the saddle type vehicle (102), the one or more objects, the estimated vehicle orientation, the local map and the dimensions of the saddle type vehicle (102);
segmenting, by the control unit (106), the local map into at least one potential parking space for the saddle type vehicle (102);
determining, by the control unit (106), a score for each of the at least one potential parking space; and
selecting, by the control unit (106), one of the at least one potential parking space from the local map as the parking space for the saddle type vehicle (102) based on the determined score.

19. The method (400) as claimed in claim 16 comprising:
alerting, by an instrument cluster (124) disposed in the saddle type vehicle (102) and communicatively coupled to the control unit (106), a user of the saddle type vehicle (102) upon detecting the parking space for the saddle type vehicle (102); and
displaying, by the instrument cluster (124), the one or more objects, the local map, and the traversal path to the user of the saddle type vehicle (102).

20. The method (400) as claimed in claim 12 comprising providing, by the method (400), riding assistance for parking the saddle type vehicle (102), when a parking switch (126) is operated to an ON condition, the parking switch (126) being disposed in the saddle type vehicle (102).

21. The method (400) as claimed in claim 12 comprising providing, riding assistance for parking the saddle type vehicle (102) during one of a forward riding condition and a reverse riding condition of the saddle type vehicle (102).

22. The method (400) as claimed in claim 12 comprising procuring, by a database (128) communicatively coupled to the control unit (106), information pertaining to surroundings of the saddle type vehicle (102).

Documents

Application Documents

# Name Date
1 202341059717-STATEMENT OF UNDERTAKING (FORM 3) [05-09-2023(online)].pdf 2023-09-05
2 202341059717-REQUEST FOR EXAMINATION (FORM-18) [05-09-2023(online)].pdf 2023-09-05
3 202341059717-PROOF OF RIGHT [05-09-2023(online)].pdf 2023-09-05
4 202341059717-POWER OF AUTHORITY [05-09-2023(online)].pdf 2023-09-05
5 202341059717-FORM 18 [05-09-2023(online)].pdf 2023-09-05
6 202341059717-FORM 1 [05-09-2023(online)].pdf 2023-09-05
7 202341059717-FIGURE OF ABSTRACT [05-09-2023(online)].pdf 2023-09-05
8 202341059717-DRAWINGS [05-09-2023(online)].pdf 2023-09-05
9 202341059717-DECLARATION OF INVENTORSHIP (FORM 5) [05-09-2023(online)].pdf 2023-09-05
10 202341059717-COMPLETE SPECIFICATION [05-09-2023(online)].pdf 2023-09-05
11 202341059717-Covering Letter [12-06-2024(online)].pdf 2024-06-12