Abstract: ACOUSTIC VEHICLE ALERT SYSTEM AND RELATED METHOD THEREOF The present invention relates to an acoustic vehicle alert system (AVAS) (100) for a vehicle. The system (100) includes an ADAS front controller (109) configured for processing a first predetermined data, an ADAS rear controller (110) configured for processing a second predetermined data, an infotainment control module (111) configured for processing a third predetermined data, and a vehicle speed sensor (118) configured to generate a predefined signal indicative of vehicle speed. An AVAS control module (112) is communicatively coupled to the ADAS front controller (109), the ADAS rear controller (110), the infotainment control module (111), and the vehicle speed sensor (118). The AVAS control module (112) is configured to execute the first, second and third predetermined data through an AI module (114) to generate a predefined acoustic alert signal when the vehicle speed is less than a predetermined vehicle speed and each of the first, second and third predetermined data corresponds to a predetermined threshold. Figure 1
FORM 2
THE PATENTS ACT 1970
[39 OF 1970]
AND
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10; rule 13]
TITLE OF THE INVENTION
ACOUSTIC VEHICLE ALERT SYSTEM AND RELATED METHOD THEREOF
APPLICANT(S)
TATA MOTORS PASSENGER VEHICLES LIMITED
an Indian company having its registered office at
Floor 3, 4, Plot-18, Nanavati Mahalaya,
Mudhana Shetty Marg, BSE, Fort,
Mumbai 400 001, Mumbai City, Maharashtra, India.
TATA MOTORS EUROPEAN TECHNICAL CENTRE
18 Grosvenor Place, London, SW1X 7HS, United Kingdom, Nationality United Kingdom.
PREAMBLE TO THE DESCRIPTION
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD OF THE INVENTION
The present disclosure, in general, relates to the field of automobiles. Particularly, but not exclusively, the present disclosure relates to an acoustic vehicle alert system and a related method thereof.
BACKGROUND OF THE INVENTION
With the rising demand for alternative modes of transportation that are environmentally friendly, and in order to improve performance, increase fuel economy and reduce emissions from conventional internal combustion engines, electric drive vehicles such as hybrid electric vehicles (HEVs), plug-in hybrid electric vehicles (PHEVs), and all-electric vehicles (EVs) have been developed.
Typically, electric drive vehicles produce much less noise than conventional internal combustion engine vehicles and hence are often too quiet to be heard, especially at low speeds. As a result, pedestrians, particularly older adults, children and blind people walking on a roadway, are unable to detect an approaching vehicle and are at high risk of vehicle-pedestrian collisions and accidents.
In an effort to substantially reduce the number of accidents caused by non-detection of approaching vehicles by pedestrians, various government regulations have been brought into force which mandate that all non-IC engine vehicles include an acoustic vehicle alert system that generates a sound signal of a certain decibel level, in the form of an audio alert, while the vehicle is moving in a forward or reverse direction at a speed of up to 20 km/h. Thus, various types of pedestrian alert systems have been developed by vehicle manufacturers to alert surrounding vehicles and pedestrians of potential accidents and collisions.
An existing vehicle pedestrian alert system installed on the vehicle is configured to automatically generate audio alert signals at predefined time intervals, without determining the presence of a pedestrian or pedestrian-like object in the vehicle pathway. This may sometimes result in undesirable noise. Furthermore, the audio alert signals transmitted by speakers installed on the vehicle are emitted at a predetermined constant decibel value and hence may remain unheard by pedestrians, particularly older adults, blind people or animals in the vehicle pathway. Thus, the current acoustic vehicle alert system is ineffective in providing accurate alert signals to substantially reduce potential vehicle-pedestrian collisions.
The present disclosure is directed to overcoming one or more of the limitations stated above or any other limitations associated with the known art.
SUMMARY OF THE INVENTION
One or more shortcomings of the prior art are overcome by a system as claimed and additional advantages are provided through the device and a system as claimed in the present disclosure. Additional features and advantages are realized through the techniques of the present disclosure. Other embodiments and aspects of the disclosure are described in detail herein and are considered a part of the claimed disclosure.
In one non-limiting embodiment of the disclosure, an acoustic vehicle alert system (AVAS) for a vehicle includes an ADAS front controller configured for processing a first predetermined data. The first predetermined data corresponds to a cumulative data generated from a front radar and a front camera module mounted in at least a front portion of the vehicle. The acoustic vehicle alert system (AVAS) further includes an ADAS rear controller configured for processing a second predetermined data. The second predetermined data corresponds to a cumulative data generated from a left-hand rear radar and a right-hand rear radar mounted to a left-hand side and a right-hand side of the vehicle, respectively. Also, the acoustic vehicle alert system (AVAS) includes an infotainment control module configured for processing a third predetermined data. The third predetermined data corresponds to a cumulative data generated from a plurality of modules including an ambient sensor, a user select module and a microphone. A vehicle speed sensor mounted on at least a portion of the vehicle is configured to generate a predefined signal indicative of vehicle speed. Further, in an embodiment, the acoustic vehicle alert system (AVAS) includes an AVAS control module communicatively coupled to the ADAS front controller, the ADAS rear controller, the infotainment control module, and the vehicle speed sensor. The AVAS control module is configured to execute the first, second and third predetermined data through an Artificial Intelligence (AI) module to generate a predefined acoustic alert signal when the vehicle speed is less than a predetermined vehicle speed and each of the first, second and third predetermined data corresponds to a predetermined threshold.
In an embodiment, the ambient sensor (106) is configured for generating a plurality of ambient signals corresponding to a plurality of vehicle surrounding conditions, wherein the plurality of vehicle surrounding conditions refer to a daytime condition and a nighttime condition.
In an embodiment, the AI module includes pre-stored data points stored in a knowledge base therein. The pre-stored data points correspond to a plurality of data points indicative of alert signal generation.
In an embodiment, the AI module includes a deep learning algorithm and a Kalman filter. The AI module is configured to perform data matching of the first, second and third predetermined data and the vehicle speed data with corresponding pre-stored data points in the AI module.
In an embodiment, the predetermined threshold corresponds to a plurality of data values for the first, second and third predetermined data. The predefined acoustic alert signal is generated when the plurality of data values equal a plurality of pre-stored data points in the AI module during execution of the first, second and third predetermined data by the AVAS control module.
In an embodiment, matching of at least one of the first, second and third predetermined data and the vehicle speed data generates an acoustic alert signal.
In an embodiment, upon generation of the acoustic alert signal, the AVAS control module sends an actuation signal to at least one of the front and rear speakers. The front speaker is located on at least a front portion of the vehicle and the rear speaker is located on at least a rear portion of the vehicle. Each of the front and rear speakers is integrally built with an electric motor.
In an embodiment, the electric motors are configured to drive the at least one of the front and rear speakers.
In an embodiment, the front radar is mounted on at least a front portion of the vehicle and configured for detecting objects in a vehicle path to the front portion of the vehicle.
In an embodiment, the front camera module is mounted in the at least a front portion of the vehicle and configured to capture images of the objects and pedestrians located to the front of the vehicle at a predetermined distance from the vehicle.
In an embodiment, the left and right hand side rear radars are configured for capturing images of a rear region of the vehicle.
In an embodiment, an ambient sensor is configured for generating a plurality of ambient signals corresponding to a plurality of vehicle surrounding conditions.
In an embodiment, upon generation of the acoustic alert signal, the AVAS control module sends an actuation signal to at least one of the front and rear speakers mounted on the vehicle. The front speaker is located on at least a front portion of the vehicle and the rear speaker is located on at least a rear portion of the vehicle.
In an embodiment, upon actuation, the at least one of the front and rear speakers is oriented to a predetermined angle in order to be directed towards the pedestrian. The at least one of the front and rear speakers generates audio messages corresponding to the acoustic alert signal to provide real-time alert information to the pedestrian.
In an embodiment, the acoustic vehicle alert system (AVAS) includes a user interface such as user select module capable of enabling user settings and preferences. The user select module is integrally provided with a sound profile logic, thereby allowing the user of the vehicle to select a predetermined sound profile mode from a plurality of sound profile options provided in the sound profile logic.
An object of the present disclosure is to provide an improved acoustic vehicle alert system (AVAS) that is configured to modulate the decibel values of the generated acoustic sound signal based on the determined ambient condition.
Another object of the present invention is to provide an improved and effective acoustic vehicle alert system (AVAS) configured to determine a predetermined direction for orienting the at least one of the front and rear speakers so as to direct the transmission of the corresponding audio alert. This facilitates reducing sound pollution in case there are no pedestrians that need an alert, while at the same time directing the sound speaker towards the pedestrian, thereby providing an effective alert of approaching vehicles in the pedestrian pathway.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives, and advantages thereof, will best be understood by reference to the following detailed description of illustrative embodiments when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures, wherein like reference numerals represent like elements and in which:
Figure 1 illustrates an acoustic vehicle alert system (AVAS) for a vehicle, in accordance with an embodiment of the present disclosure.
Figure 2 shows a flow-chart illustrating a method of operating the acoustic vehicle alert system (AVAS) for a vehicle, in accordance with an embodiment of the present disclosure.
The figures depict embodiments of the disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following description that alternative embodiments of the system and method illustrated herein may be employed without departing from the principles of the disclosure described herein.
DETAILED DESCRIPTION OF THE INVENTION
The foregoing has broadly outlined the features and technical advantages of the present disclosure in order that the detailed description of the disclosure that follows may be better understood. Additional features and advantages of the disclosure will be described hereinafter, which form the subject of the claims of the disclosure. It should be appreciated by those skilled in the art that the conception and specific embodiments disclosed may be readily utilized as a basis for modifying other devices, systems, assemblies and mechanisms for carrying out the same purposes of the present disclosure. It should also be realized by those skilled in the art that
such equivalent constructions do not depart from the scope of the disclosure as set forth in the appended claims. The novel features which are believed to be characteristics of the disclosure, to its device or system, together with further objects and advantages will be better understood from the following description when considered in connection with the accompanying figures. It is to be expressly understood, however, that each of the figures is provided for the purpose of illustration and description only and is not intended as a definition of the limits of the present disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a system or a device that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such a setup or device. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
Reference will now be made to the exemplary embodiments of the disclosure, as illustrated in the accompanying drawings. Wherever possible, the same numerals have been used to refer to the same or like parts. The following paragraphs describe the present disclosure with reference to Figs. 1-2. It is to be noted that the system may be employed in any vehicle including, but not limited to, a passenger vehicle, a utility vehicle, commercial vehicles, and any other vehicle including a battery power source and an electric drive motor. For the sake of clarity, the vehicle is not shown.
Figure 1 illustrates an acoustic vehicle alert system (AVAS) (100) for a vehicle (not shown), in accordance with an embodiment of the present disclosure. In the illustrated embodiment, the acoustic vehicle alert system (AVAS) (100) includes a front radar (101) mounted on at least a front portion of the vehicle and configured for detecting objects in a vehicle path to the front portion of the vehicle, and a front camera module (102) mounted in the at least a front portion of the vehicle and configured to capture images of the objects and pedestrians located to the front of the vehicle at a predetermined distance from the vehicle. Further, the acoustic vehicle alert system (AVAS) (100) includes a left-hand rear radar (103) laterally mounted in at least a rear portion of the vehicle. Particularly, the left-hand rear radar (103) is located on a left-hand side of a vehicle body. Also, a right-hand rear radar (104) is laterally mounted in at least the rear portion of the vehicle. In particular, the right-hand rear radar (104) is located on a right-hand side of the vehicle body. The left-hand and right-hand side rear radars (103), (104) are configured for capturing images of a rear region of the vehicle.
In the illustrated embodiment, the acoustic vehicle alert system (AVAS) (100) further includes an ambient sensor (106) configured for generating a plurality of ambient signals corresponding to a plurality of vehicle surrounding conditions. The plurality of vehicle surrounding conditions may refer to a daytime condition and a nighttime condition. Further, the acoustic vehicle alert system (AVAS) (100) includes a user interface such as a user select module (107) capable of enabling user settings and preferences. In an embodiment of the present disclosure, the user select module (107) is integrally provided with a sound profile logic, thereby allowing the user of the vehicle to select a predetermined sound profile mode from a plurality of sound profile options provided in the logic. In an example, at least one of the plurality of sound profile options may include a vehicle brand signature sound profile. The user select module (107), in an example, may be integrated with a Human Machine Interface (HMI) module of the vehicle. In another example, the user select module (107) may be provided on an instrument panel of the vehicle. Also, the acoustic vehicle alert system (AVAS) (100) includes a user interface such as a microphone (108) that may be located adjacent to the rear view mirrors. The user of the vehicle may use the microphone (108) to communicate with the infotainment control unit (111) of the vehicle.
In the illustrated embodiment, the acoustic vehicle alert system (AVAS) (100) further includes an ADAS front controller (109) and an ADAS rear controller (110). In an embodiment of the present disclosure, a first predetermined data, defined as a cumulative predetermined vehicle front data generated from the front radar (101) and the front camera module (102), is transmitted to the ADAS front controller (109) at predefined time intervals. Upon receipt, the ADAS front controller (109) processes the cumulative predetermined vehicle front data to generate a map of the vehicle surroundings, navigation information, and objects or pedestrians in a front region of the vehicle. The ADAS front controller (109) is further configured to transmit the processed cumulative predetermined data to the AVAS control module (112) through a communication protocol (115) such as a CAN bus.
Further, in the illustrated embodiment, a second predetermined data, defined as a cumulative predetermined vehicle rear data generated from the left-hand side and right-hand side rear radars (103), (104) and the rear camera module (105), is transmitted to the ADAS rear controller (110). The ADAS rear controller (110) is configured to process the cumulative predetermined vehicle rear data to generate a map of the vehicle surroundings, navigation information, and objects or pedestrians in a rear region of the vehicle. The ADAS rear controller (110) is further configured to transmit the processed cumulative predetermined data to the AVAS control module (112) through a communication protocol (116) such as a CAN bus.
In the illustrated embodiment, a third predetermined data, defined as a plurality of data corresponding to data from the ambient sensor (106), the user select module (107) and the microphone (108), is transmitted to the infotainment control unit (111). The infotainment control unit (111) further transmits the received data to the acoustic vehicle alert system (AVAS) (100) through a communication protocol (117) such as a CAN bus. In the illustrated embodiment, a vehicle speed sensor (118) is communicatively coupled to the AVAS control module (112) and configured to send signals indicative of vehicle speed.
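By way of illustration only, the following Python sketch shows one possible way the AVAS control module (112) could gather the three predetermined data sets and the vehicle speed over the CAN-style channels (115), (116), (117). The field names, channel identifiers and the `bus.latest()` interface are assumptions introduced here for readability and are not part of the disclosure.

```python
# Illustrative sketch only: field names, channel identifiers and the
# hypothetical `bus.latest()` interface are assumptions, not the claimed design.
from dataclasses import dataclass

@dataclass
class FrontData:            # first predetermined data (ADAS front controller 109)
    pedestrian_detected: bool
    ped_x_m: float          # assumed pedestrian position, x forward
    ped_y_m: float          # assumed pedestrian position, y to the left
    distance_m: float

@dataclass
class RearData:             # second predetermined data (ADAS rear controller 110)
    pedestrian_detected: bool
    distance_m: float

@dataclass
class InfotainmentData:     # third predetermined data (infotainment unit 111)
    ambient_is_daytime: bool
    sound_profile: str

def read_can_frames(bus):
    """Poll a CAN-style bus object and decode the latest frame from each
    controller; `bus` is a stand-in for whatever transport the vehicle uses."""
    front = FrontData(**bus.latest("ADAS_FRONT"))            # via protocol (115)
    rear = RearData(**bus.latest("ADAS_REAR"))                # via protocol (116)
    info = InfotainmentData(**bus.latest("INFOTAINMENT"))     # via protocol (117)
    speed_kmph = bus.latest("VEHICLE_SPEED")["speed_kmph"]    # sensor (118)
    return front, rear, info, speed_kmph
```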
In the illustrated embodiment, the AVAS control module (112) is electronically coupled to the Artificial Intelligence (AI) module (114). As per the illustrated embodiment of the present disclosure, the AVAS control module (112) is configured to execute the first, second and third predetermined data and a vehicle speed data through a learning algorithm integrally built into the AI module (114) to generate a predefined acoustic alert signal when the vehicle speed is less than a predetermined vehicle speed and each of the first, second and third predetermined data corresponds to a predetermined threshold.
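Purely as a non-limiting sketch of the condition described above, the following assumes the data classes from the previous sketch and treats the AI module's matching as an opaque `matches_prestored` callable; the 20 km/h figure is taken from the regulatory speed mentioned in the background and is illustrative only.

```python
SPEED_THRESHOLD_KMPH = 20.0   # illustrative predetermined vehicle speed

def should_generate_alert(front, rear, info, speed_kmph, matches_prestored):
    """Generate the predefined acoustic alert only when the vehicle speed is
    below the predetermined speed and each of the first, second and third
    predetermined data matches the pre-stored data points (the matching
    itself, e.g. deep learning plus Kalman filtering, is abstracted away)."""
    if speed_kmph >= SPEED_THRESHOLD_KMPH:
        return False
    return all(matches_prestored(data) for data in (front, rear, info))
```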
Particularly, the AI module (114) comprises pre-stored data points stored in a knowledge base therein. The pre-stored data points correspond to a plurality of data points indicative of alert signal generation. Thus, the AI module (114), with the deep learning algorithm and a Kalman filter, performs data matching of the first, second and third predetermined data and the vehicle speed data with corresponding pre-stored data points in the AI module (114). In the illustrated embodiment, matching of at least one of the first, second and third predetermined data and the vehicle speed data generates an acoustic alert signal. Upon generation of the acoustic alert signal, the AVAS control module (112) sends an actuation signal to at least one of the front and rear speakers (119), (121) mounted on the vehicle. In the illustrated embodiment, the front speaker (119) is located on at least a front portion of the vehicle and the rear speaker (121) is located on at least a rear portion of the vehicle. In the illustrated embodiment, the front and rear speakers (119), (121) are integrally built with electric motors. Thus, upon generation of the actuation signal, the electric motors (113), (120) are configured to drive the at least one of the front and rear speakers (119), (121). On actuation, the at least one of the front and rear speakers (119), (121) is oriented to a predetermined angle in order to be directed towards the pedestrian. Further, the at least one of the front and rear speakers generates audio messages corresponding to the acoustic alert signal to provide real-time alert information to the pedestrian, thereby preventing potential vehicle collisions and accidents. The audio messages may be generated at a decibel value equal to or greater than a predetermined decibel value, or below the predetermined decibel value, based on a plurality of factors determined by the AVAS control module (112). That is, in an example, upon determination of the vehicle surrounding condition as the daytime condition and detection of at least one pedestrian, the audio message may be generated at a decibel value equal to or greater than the predetermined decibel value. In another example, upon determination of the vehicle surrounding condition as the nighttime condition, the generated audio message may be below the predetermined decibel value.
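As a simple illustration of the decibel modulation just described, the sketch below selects an output level relative to a predetermined decibel value. The numeric values, and the handling of the unspecified daytime-with-no-pedestrian case, are placeholders assumed here; the disclosure fixes no particular figures.

```python
PREDETERMINED_DB = 60.0   # placeholder; the disclosure fixes no numeric value

def alert_level_db(is_daytime: bool, pedestrian_detected: bool) -> float:
    """Daytime with a detected pedestrian: at or above the predetermined
    decibel value; nighttime (or, as assumed here, no pedestrian): below it."""
    if is_daytime and pedestrian_detected:
        return PREDETERMINED_DB + 5.0    # equal to or greater than threshold
    return PREDETERMINED_DB - 10.0       # below threshold
```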
Moreover, in an embodiment of the present disclosure, the disclosed acoustic vehicle alert system (AVAS) (100) is configured to determine an accurate location of pedestrians, pedestrian-like objects and other objects in real-time, and orients the at least one of the front and rear speakers (119), (121) to the predetermined angle before the acoustic alert signal is transmitted, such that the audio alert message of the approaching vehicle can be recognized and heard precisely, thereby significantly reducing accidents and vehicle collisions. Advantageously, the orientation of the at least one of the front and rear speakers (119), (121), upon actuation, is configured to alert pedestrians, which may include adults, blind persons, children, dogs, cats, livestock, or other animals. This facilitates accurate alert signal transmission and enhanced acoustic awareness from the vehicle to the surroundings of the vehicle and to pedestrians located at the predetermined distance in real-time. Additionally, the illustrated embodiment discloses an improved acoustic vehicle alert system (AVAS) (100) configured to modulate the decibel values of the generated acoustic sound signal based on the determined ambient condition. Also, the improved acoustic vehicle alert system (AVAS) (100) is configured to determine a predetermined direction for orienting the at least one of the front and rear speakers (119), (121) so as to direct the transmission of the corresponding audio alert. This facilitates reducing sound pollution in case there are no pedestrians that need an alert, while at the same time directing the sound speaker towards the pedestrian, thereby providing an effective alert of approaching vehicles in the pedestrian pathway.
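The speaker orientation described above can be pictured, under an assumed planar coordinate convention, as computing a bearing from the vehicle to the detected pedestrian. The convention and the function below are illustrative assumptions, not part of the disclosure.

```python
import math

def speaker_orientation_deg(ped_x_m: float, ped_y_m: float) -> float:
    """Bearing from the vehicle reference point to the pedestrian, assuming
    x points forward and y points to the left of the vehicle; the motor-driven
    speaker (119) or (121) would be rotated to this angle before the alert."""
    return math.degrees(math.atan2(ped_y_m, ped_x_m))
```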
Figure 2 shows a flow-chart illustrating a method of operating the acoustic vehicle alert system (AVAS) (100) for the vehicle, in accordance with an embodiment of the present disclosure. In the illustrated embodiment, the acoustic vehicle alert system (AVAS) (100) includes the front radar (101) and the front camera module (102), configured to generate the first predetermined data, the rear radars (103), (104) and the rear camera module (105), configured to generate the second predetermined data, and the user interfaces, such as the user select module (107) and the microphone (108), together with the ambient sensor (106), configured to generate the third predetermined data. The ADAS front controller (109) processes the first predetermined data to generate a front vehicle data map with predefined information of the front region of the vehicle. The ADAS rear controller (110) processes the second predetermined data to generate a rear vehicle data map with predefined information of the rear region of the vehicle.
Further, the infotainment control unit (111) processes the third predetermined data to generate a cumulative data corresponding to the ambient sensor and the user interfaces. The AVAS control module (112), being communicatively coupled to the ADAS front controller (109), the ADAS rear controller (110) and the infotainment control unit (111), is configured to determine and generate the predetermined acoustic alert sound signal to be transmitted to the pedestrian via the at least one of the front and rear speakers (119), (121). Particularly, the AI module (114), coupled to the AVAS control module (112) and having an integrally built deep learning algorithm and predictive logic, creates an actuation signal for the at least one of the front and rear speakers (119), (121). On generation of the actuation signal, the at least one of the front and rear speakers is oriented at the predetermined angle corresponding to the location of the pedestrian at the predetermined distance from the vehicle. Upon orienting, the at least one of the front and rear speakers (119), (121) generates the real-time acoustic alert signal to alert the pedestrian of the approaching vehicle.
With reference to an illustration, the method starts at step 201 when the ignition of the vehicle is in the ON condition. In the illustrated embodiment, at step 202, the first, second and third predetermined data is determined by the ADAS front and rear controllers (109), (110) and the infotainment control unit (111) of the vehicle. At step 203, a vehicle control unit determines whether the acoustic vehicle alert system is disabled. In the illustrated embodiment, the acoustic vehicle alert system is configured to be automatically activated upon vehicle ignition ON. In a situation wherein the acoustic vehicle alert system (AVAS) (100) is disabled based on user preferences through the actuation of an override switch, the vehicle control unit is configured to determine the activation/deactivation condition of the acoustic vehicle alert system (AVAS) (100), as illustrated at step 203. If the acoustic alert system is in the activated condition, then the AVAS control module determines the vehicle speed based on the vehicle speed signal received from the vehicle speed sensor, at step 204. At step 205, if the AVAS control module (112) determines that the vehicle speed is less than a predetermined vehicle speed, then the method moves to step 206; else the method moves to step 204. At step 206, the AVAS control module (112) determines and processes the data from at least a portion of the third predetermined data. That is, the AVAS control module (112) determines the ambient condition of the vehicle. At step 208, if it is determined that the ambient condition signal is indicative of the daytime condition, then at step 209 the AVAS control module (112) determines the presence of at least one pedestrian located at the predetermined distance from the vehicle and transmits a predetermined orientation signal to the at least one of the front and rear speakers (119), (121) of the vehicle. At step 210, the at least one of the front and rear speakers (119), (121) is oriented at a predetermined angle such that it faces the location of the pedestrian. Further, upon orientation, the at least one of the front and rear speakers (119), (121) is actuated by the electric motors (113), (120) to generate in real-time the audio alert messages in the form of the acoustic sound signal to alert the pedestrian in the vicinity of the approaching vehicle, at step 211. The acoustic sound signal generated, as illustrated at step 211, is equal to or greater than a predetermined decibel value. However, if at step 206 it is determined that the ambient condition signal is indicative of the nighttime condition, then audio alert messages in the form of the acoustic sound signal are generated at a decibel value below the predetermined value to alert the pedestrian in the vicinity of the approaching vehicle, at step 207.
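For readability, the flow of Figure 2 can be summarised by the following non-limiting sketch, reusing the illustrative helpers from the earlier sketches. The `speakers` object and its `orient()`/`play()` methods are assumptions introduced only to show the sequence of steps, and only the front-pedestrian branch is shown.

```python
def avas_cycle(bus, speakers, avas_enabled: bool):
    """One pass through steps 201-211 of Figure 2 (illustrative reading only)."""
    if not avas_enabled:                                      # step 203: override switch
        return
    front, rear, info, speed_kmph = read_can_frames(bus)      # steps 202, 204
    if speed_kmph >= SPEED_THRESHOLD_KMPH:                    # step 205
        return
    if info.ambient_is_daytime:                               # steps 206, 208
        if front.pedestrian_detected:                         # step 209 (front case shown)
            angle = speaker_orientation_deg(front.ped_x_m, front.ped_y_m)
            speakers.orient(angle)                            # step 210
            speakers.play(level_db=PREDETERMINED_DB)          # step 211: at or above threshold
    else:
        speakers.play(level_db=PREDETERMINED_DB - 10.0)       # step 207: below threshold
```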
Equivalents:
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from
the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous
to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B, or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase "A or B" will be understood to include the possibilities of "A" or "B" or "A and B."
In addition, where features or aspects of the disclosure are described in terms of Markush groups, those skilled in the art will recognize that the disclosure is also thereby described in terms of any individual member or subgroup of members of the Markush group.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Referral Numerals:
Reference Number Description
100 Acoustic vehicle alert system (AVAS)
101 Front radar
102 Front camera module
103 Left-hand side rear radar
104 Right-hand side rear radar
105 Rear-camera module
106 Ambient sensor
107 User select module
108 Microphone
109 Advanced driver assistance systems (ADAS) front controller
110 Advanced driver assistance systems (ADAS) rear controller
111 Infotainment control module
112 AVAS Control module
113, 120 Electric motors
114 Artificial Intelligence Module
115 Communication protocol
116 Communication protocol
117 Communication protocol
118 Vehicle speed sensor
119 Front speaker
121 Rear speaker
200 Flow-chart
201 -211 Method steps
We Claim:
1. An acoustic vehicle alert system (AVAS) (100) for a vehicle, the system (100)
comprising:
an ADAS front controller (109) configured for processing a first predetermined data, the first predetermined data corresponds to a cumulative data generated from a front radar (101) and a front camera module (102) mounted in at least a front portion of the vehicle;
an ADAS rear controller (110) configured for processing a second predetermined data, the second predetermined data corresponds to a cumulative data generated from a left-hand rear radar (103) and a right-hand rear radar (104) mounted to a left-hand side and right-hand side of the vehicle, respectively;
an infotainment control module (111) configured for processing a third predetermined data, the third predetermined data corresponds to a cumulative data generated from a plurality of modules including an ambient sensor (106), a user select module (107) and a microphone (108);
a vehicle speed sensor (118) configured to generate a predefined signal indicative of vehicle speed; and
an AVAS control module (112) communicatively coupled to the ADAS front controller (109), the ADAS rear controller (110), the infotainment control module (111), and the vehicle speed sensor (118), the AVAS control module (112) configured to execute the first, second and third predetermined data through an Artificial Intelligence (AI) module (114) to generate a predefined acoustic alert signal when the vehicle speed is less than a predetermined vehicle speed and each of the first, second and third predetermined data corresponds to a predetermined threshold.
2. The system (100) as claimed in claim 1, wherein the ambient sensor (106) is configured for generating a plurality of ambient signals corresponding to a plurality of vehicle surrounding conditions; and wherein the plurality of vehicle surrounding conditions refer to a daytime condition and a nighttime condition.
3. The system (100) as claimed in claim 1, wherein the AI module (114) includes pre-stored data points stored in a knowledge base therein; wherein the pre-stored data points correspond to a plurality of data points indicative of alert signal generation.
4. The system (100) as claimed in claim 1, wherein the AI module (114) includes a deep learning algorithm and a Kalman filter; and wherein the AI module (114) is configured to perform data matching of the first, second and third predetermined data and the vehicle speed data with corresponding pre-stored data points in the AI module (114).
5. The system (100) as claimed in claim 1, wherein the predetermined threshold corresponds to a plurality of data values for the first, second and third predetermined data; and wherein the predefined acoustic alert signal is generated when the plurality of data values equal a plurality of pre-stored data points in the AI module (114) during execution of the first, second and third predetermined data by the AVAS control module (112).
6. The system (100) as claimed in claim 1, wherein matching of at least one of the first, second and third predetermined data and the vehicle speed data generates an acoustic alert signal.
7. The system (100) as claimed in claim 1, wherein upon generation of the
acoustic alert signal, the AVAS control module (112) sends an actuation
signal to at least one of the front and rear speakers (119), (121); and wherein
the front speaker (119) is located on at least a front portion of the vehicle and the rear speaker (121) is located on at least a rear portion of the vehicle; and wherein the front and rear speakers (119), (121) are integrally built with electric motors (120), (113), respectively.
8. The system (100) as claimed in claim 1, wherein the electric motors (120), (113) are configured to drive the at least one of the front and rear speakers (119), (121), respectively.
9. The system (100) as claimed in claim 1, wherein the front radar (101) is mounted on at least a front portion of the vehicle and configured for detecting objects in a vehicle path to the front portion of the vehicle.
10. The system (100) as claimed in claim 1, wherein the front camera module (102) is mounted in the at least a front portion of the vehicle and configured to capture images of the objects and pedestrians located to the front of the vehicle at a predetermined distance from the vehicle.
11. The system (100) as claimed in claim 1, wherein the left and right hand side rear radars (103), (104) are configured for capturing images of a rear region of the vehicle.
12. The system (100) as claimed in claim 1, wherein an ambient sensor (106) is configured for generating a plurality of ambient signals corresponding to a plurality of vehicle surrounding conditions.
13. The system (100) as claimed in claim 1, wherein upon generation of the acoustic alert signal, the AVAS control module (112) sends an actuation signal to at least one of the front and rear speakers (119), (121) mounted on the vehicle; and wherein the front speaker (119) is located on at least a front portion of the vehicle and the rear speaker (121) is located on at least a rear portion of the vehicle.
14. The system (100) as claimed in claim 1, wherein upon actuation, the at least one of the front and rear speakers (119), (121) is oriented to a predetermined angle in order to be directed towards the pedestrian; and wherein the at least one of the front and rear speakers generates audio messages corresponding to the acoustic alert signal to provide real-time alert information to the pedestrian.
15. The system (100) as claimed in claim 1, wherein the acoustic vehicle alert system (AVAS) includes a user interface such as user select module capable of enabling user settings and preferences; wherein the user select module is integrally provided with a sound profile logic, thereby allowing the user of the vehicle to select a predetermined sound profile mode from a plurality of sound profile options provided in the sound profile logic.
16. A method of operating the acoustic vehicle alert system (AVAS) (100) for
the vehicle, the acoustic vehicle alert system (AVAS) (100) comprising an
ADAS front controller (109) configured for processing a first predetermined
data, the first predetermined data corresponds to a cumulative data generated
from a front radar (101) and a front camera module (102) mounted in at least
a front portion of the vehicle; an ADAS rear controller (110) configured for
processing a second predetermined data, the second predetermined data
corresponds to a cumulative data generated from a left-hand rear radar (103)
and a right-hand rear radar (104) mounted to a left-hand side and right-hand
side of the vehicle, respectively; and an infotainment control module (111)
configured for processing a third predetermined data, the third predetermined
data corresponds to a cumulative data generated from a plurality of modules
including an ambient sensor (106), a user select module (107) and a
microphone (108); a vehicle speed sensor (118) configured to generate a
predefined signal indicative of vehicle speed, the method comprising:
executing the first, second and third predetermined data, by the AVAS control module (112) through an Artificial Intelligence (AI) module (114); and
generating a predefined acoustic alert signal when the vehicle speed is less than a predetermined vehicle speed and each of the first, second and third predetermined data corresponds to a predetermined threshold.