Abstract: The present disclosure relates to system(s) and method(s) for generating a personalized maintenance alert for a vehicle (103). The system receives acoustic data associated with one or more segments in the vehicle (103), and vehicle data. Further, the system filters noise data from the acoustic data. Based on filtering the noise data, the system determines an acoustic pattern for the one or more segments. The system further compares the acoustic pattern with an ideal acoustic pattern and a historical acoustic pattern to identify a pattern change. Based on an analysis of the pattern change and the vehicle data, the system predicts an issue associated with the one or more segments. Upon prediction of the issue, the system generates the personalized maintenance alert for the one or more segments in the vehicle (103).
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application does not claim priority from any patent application.
TECHNICAL FIELD
[002] The present disclosure in general relates to the field of vehicle maintenance. More particularly, the present invention relates to a system and method for generating a personalized maintenance alert for a vehicle.
BACKGROUND
[003] Generally, vehicle maintenance is an important aspect for every vehicle owner. It is to be noted that a vehicle is typically serviced when a defined duration has elapsed or when the vehicle owner starts facing problems in the vehicle. However, there exists a possibility of accidents due to sudden failure of any component in the vehicle. Further, some systems are available to diagnose the condition of components such as an engine, a tire, an air conditioner and the like. However, such systems require multiple devices to be mounted at different components in the vehicle. In this case, the data received from the different components may be erroneous or inconsistent. Hence, the available systems may not be able to predict failure of any component in the vehicle.
SUMMARY
[004] Before the present systems and methods for generating a personalized maintenance alert for a vehicle are described, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce concepts related to systems and methods for generating the personalized maintenance alert for the vehicle. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a system for generating a personalized maintenance alert for a vehicle is illustrated. The system comprises a memory and a processor coupled to the memory, wherein the processor is configured to execute instructions stored in the memory. The processor may execute instructions stored in the memory to receive acoustic data associated with one or more segments of a vehicle from one or more acoustic sensors mounted in the one or more segments, and vehicle data. The one or more segments may comprise an engine compartment, a vehicle inside cabin, and a vehicle back side. The vehicle data may comprise a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, and a manufacturer of the vehicle. The processor may further execute instructions stored in the memory to filter noise data from the acoustic data of the one or more segments. The noise data may be filtered based on applying a filtering technique on the acoustic data. Further, the processor may execute instructions stored in the memory to determine an acoustic pattern associated with the one or more segments using a machine learning model upon filtration of the noise data. Furthermore, the processor may execute instructions stored in the memory to identify a pattern change based on a comparison of the acoustic pattern, an ideal acoustic pattern and a historical acoustic pattern of the one or more segments. The processor may further execute instructions stored in the memory to develop and mature the machine learning model individually for the vehicle based on the acoustic data and corresponding user inputs using a collaborative learning technique. The processor may further execute instructions stored in the memory to predict an issue associated with the one or more segments based on the pattern change and the vehicle data. Further, the processor may execute instructions stored in the memory to generate the personalized maintenance alert for the one or more segments of the vehicle based on the prediction of the issue.
[006] In another implementation, a method for generating a personalized maintenance alert for a vehicle is illustrated. In one embodiment, the method may comprise receiving acoustic data associated with one or more segments of a vehicle from one or more acoustic sensors mounted in the one or more segments, and vehicle data. The one or more segments may comprise an engine compartment, a vehicle inside cabin, and a vehicle back side. The vehicle data may comprise a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, and a manufacturer of the vehicle. The method may further comprise filtering noise data from the acoustic data of the one or more segments. The noise data may be filtered based on applying a filtering technique on the acoustic data. Further, the method may comprise determining an acoustic pattern associated with the one or more segments using a machine learning model upon filtration of the noise data. Furthermore, the method may comprise identifying a pattern change based on a comparison of the acoustic pattern, an ideal acoustic pattern and a historical acoustic pattern of the one or more segments. The method may further comprise developing and maturing the machine learning model individually for the vehicle based on the acoustic data and corresponding user inputs using a collaborative learning technique. The method may further comprise predicting an issue associated with the one or more segments based on the pattern change and the vehicle data. Further, the method may comprise generating the personalized maintenance alert for the one or more segments of the vehicle based on the prediction of the issue.
BRIEF DESCRIPTION OF DRAWINGS
[007] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[008] Figure 1A illustrates a network implementation of a system for generating a personalized maintenance alert for a vehicle, in accordance with an embodiment of the present subject matter.
[009] Figure 1B illustrates an overview of the system for generating a personalized maintenance alert for a vehicle, in accordance with an embodiment of the present subject matter.
[0010] Figure 2 illustrates the system for generating the personalized maintenance alert for the vehicle, in accordance with an embodiment of the present subject matter.
[0011] Figure 3 illustrates a method for generating a personalized maintenance alert for a vehicle, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0012] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. The words “receiving”, “filtering”, “determining”, “identifying”, “developing”, “maturing”, “predicting”, “generating”, and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods for generating a personalized maintenance alert for a vehicle are now described. The disclosed embodiments of the system and method for generating the personalized maintenance alert for the vehicle are merely exemplary of the disclosure, which may be embodied in various forms.
[0013] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure for generating a personalized maintenance alert for a vehicle is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0014] The present subject matter relates to generating a personalized maintenance alert for a vehicle. In one embodiment, acoustic data associated with one or more segments of the vehicle is received. The acoustic data may be received from one or more acoustic sensors mounted in the one or more segments. Further, vehicle data may be received. The vehicle data may comprise a vehicle driving condition, a vehicle speed, a vehicle age, a manufacturer of the vehicle and the like. Once the acoustic data and the vehicle data are received, noise data may be filtered from the acoustic data of the one or more segments. Upon filtering the noise data, an acoustic pattern associated with the one or more segments may be determined using a machine learning model. Further, the acoustic pattern may be compared with an ideal acoustic pattern and a historical acoustic pattern of the one or more segments. Based on the comparison, a pattern change may be identified. The machine learning model may be further developed and matured individually for each vehicle based on the acoustic data and corresponding user inputs using a collaborative learning technique. Further, an issue associated with the one or more segments may be predicted based on the pattern change and the vehicle data. Upon prediction of the issue, a personalized maintenance alert may be generated for the vehicle.
[0015] Referring now to Figure 1A, a network implementation 100 of a system 102 for generating a personalized maintenance alert for a vehicle is disclosed. Referring to Figure 1B, an overview of the system 102 for generating the personalized maintenance alert is illustrated. Figure 1A and Figure 1B are described together, in accordance with an embodiment of the present subject matter.
[0016] Although the present subject matter is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. The system 102 may be referred to as a backend platform. In one implementation, the system 102 may be implemented over a cloud network. Further, it will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user device 104 hereinafter, or applications residing on the user device 104. Examples of the user device 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user device 104 may be communicatively coupled to the system 102 through a network 106.
[0017] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 may be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0018] In one implementation, the system 102 may be integrated with a vehicle setup. The vehicle setup may comprise one or more acoustic sensors and a telematics system of the vehicle. The system 102 may receive data associated with one or more vehicles 103-A, 103-B…, 103-N, collectively referred to as vehicle 103 hereinafter. One or more acoustic sensors 110-A, 110-B, 110-C, …110-N, collectively referred to as one or more acoustic sensors 110 hereinafter, may be mounted in one or more segments of the vehicle 103. The one or more segments may comprise an engine compartment, a vehicle inside cabin, a vehicle back side and the like. Further, the one or more segments may comprise one or more components. The one or more components may correspond to a vehicle tire, a vehicle brake, a vehicle engine, a vehicle transmission, a vehicle steering wheel and the like.
[0019] In one embodiment, the system 102 may receive acoustic data associated with the one or more segments of the vehicle 103. The acoustic data may be received from the one or more acoustic sensors 110 mounted in the one or more segments. Further, the system 102 may receive vehicle data. In one embodiment, the vehicle data may be received from the user device based on user inputs. In another embodiment, the vehicle data may be received from a telematics system, a GPS, and the like, of the vehicle 103. The vehicle data may comprise a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, a manufacturer of the vehicle and the like. In one aspect, at block 114, the acoustic data and the vehicle data may be received.
[0020] Once the acoustic data is received, the system 102 may filter noise data from the acoustic data of the one or more segments. The noise data may be filtered based on applying a filtering technique on the acoustic data. The filtering technique may comprise determining a set of frequencies in the acoustic data of the one or more segments. Further, a subset of frequencies from the set of frequencies may be identified. The subset of frequencies may be identified based on a comparison of the sets of frequencies determined for the one or more segments. The subset of frequencies may indicate the noise data. In other words, the subset of frequencies may be common frequencies present in the acoustic data of the one or more segments. Further, the noise data may be filtered from the acoustic data of the one or more segments.
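The filtering technique described above can be sketched as follows. This is a minimal illustration only: the use of FFT spectra, a relative magnitude threshold for identifying dominant frequencies, and the bin-zeroing removal step are assumptions not fixed by the disclosure.

```python
import numpy as np

def dominant_frequencies(signal, sample_rate, rel_threshold=0.5):
    """Return the set of dominant frequencies (Hz) in a signal,
    taken as FFT bins above a fraction of the peak magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    mask = spectrum >= rel_threshold * spectrum.max()
    return set(np.round(freqs[mask]).astype(int))

def common_noise_frequencies(segment_signals, sample_rate):
    """Frequencies dominant in every segment's acoustic data are treated
    as the noise data (e.g., ambient hum picked up by all sensors)."""
    freq_sets = [dominant_frequencies(s, sample_rate) for s in segment_signals]
    return set.intersection(*freq_sets)

def filter_noise(signal, sample_rate, noise_freqs, tolerance=1.0):
    """Remove noise by zeroing FFT bins within `tolerance` Hz of any
    identified noise frequency, then reconstructing the signal."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for nf in noise_freqs:
        spectrum[np.abs(freqs - nf) <= tolerance] = 0
    return np.fft.irfft(spectrum, n=len(signal))
```

For instance, a hum present in the engine, brake and tire channels alike would be identified as the common subset and removed from each channel, while frequencies unique to one segment are retained.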
[0021] Upon filtration of the noise data, the system 102 may determine an acoustic pattern associated with the one or more segments. The acoustic pattern may be determined using a machine learning model. In one embodiment, the acoustic pattern may be determined based on elimination of the noise data from the acoustic data of the one or more segments. In one example, the machine learning model may be referred to as a backend machine learning model.
[0022] Further, the system 102 may compare the acoustic pattern with an ideal acoustic pattern and a historical acoustic pattern of the one or more segments of the vehicle 103. Based on the comparison, the system 102 may identify a pattern change. The pattern change may correspond to deterioration of the acoustic pattern from the historical acoustic pattern and the ideal acoustic pattern.
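As an illustration only (the disclosure does not specify a distance measure), the comparison against the ideal and historical patterns might be sketched as a normalized spectral distance, where a larger score indicates greater deterioration:

```python
import numpy as np

def pattern_change(current, ideal, historical):
    """Compare the current acoustic pattern (here, a spectral feature
    vector) against the ideal and historical patterns. Returns a
    deterioration score per reference; 0 means identical patterns."""
    def deterioration(pattern, reference):
        pattern = np.asarray(pattern, dtype=float)
        reference = np.asarray(reference, dtype=float)
        # Normalized Euclidean distance from the reference pattern.
        return float(np.linalg.norm(pattern - reference)
                     / (np.linalg.norm(reference) + 1e-9))
    return {
        "vs_ideal": deterioration(current, ideal),
        "vs_historical": deterioration(current, historical),
    }
```

A pattern that has drifted further from the ideal pattern than from the most recent historical pattern is consistent with gradual deterioration of the segment.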
[0023] Once the pattern change is identified, the system 102 may predict an issue associated with the one or more segments. The issue may be identified based on an analysis of the pattern change and the vehicle data. In one aspect, the system 102 may predict the issue based on analysing a vehicle condition, the vehicle issues and the like.
[0024] Upon prediction of the issue, the system 102 may generate a personalized maintenance alert for the one or more segments of the vehicle 103. The personalized maintenance alert may indicate the issue in the one or more segments. The system 102 may transmit the personalized maintenance alert to a device 105. Examples of the device 105 may include, but are not limited to, a mobile phone, a laptop and the like.
[0025] The system 102 may further maintain a model repository to store data associated with a set of machine learning models 112-A, 112-B…112-N, collectively referred to as the machine learning model 112. The machine learning models 112 may be associated with different vehicle profiles.
[0026] In one aspect, the system 102 may learn and mature the backend machine learning models 112 for each vehicle 103 based on the acoustic data and matching user data inputs. It is to be noted that an individual model 112 may be created and learned over time for each vehicle 103 to enable a personalized vehicle maintenance alert system.
[0027] In one embodiment of the present subject matter, the system 102 may be construed as a backend platform hereinafter. In one embodiment, the backend platform may be deployed over a cloud based server. In another embodiment, the backend platform may be deployed over a non-cloud based server. The backend platform may use data algorithms and machine learning based models 112 to perform data analytics. The backend platform may consist of a hardware apparatus and a software apparatus. The hardware apparatus and the software apparatus may be equipped with the acoustic sensor 110. Further, communication modules may leverage a Bluetooth interface or a Wi-Fi interface with the vehicle 103, along with embedded software built on top, to enable and execute the core functionality of the hardware apparatus or the software apparatus. In one aspect, the apparatus may be an external device, such as an aftermarket device, that can be fitted on a car to enable its monitoring and analysis.
[0028] Further, the backend platform may receive vehicle data from a GPS and an accelerometer mounted in the vehicle 103. The vehicle data may comprise a running condition of the vehicle such as idle, running, speed and the like. In one aspect, the running condition may be coupled to the acoustic data while being sent to the system 102. In one aspect, this may enable a complete non-intrusive mode of operation.
[0029] Further, the apparatus coupled to the backend platform may pre-process the acoustic data. The acoustic data may comprise raw data. The raw data may contain inconsistencies and noise due to the presence of contaminants in the surroundings. In order to reduce the effect of noise and outliers, the acoustic data may be pre-processed. The pre-processing may be software based. Further, multiple mechanisms such as filtering, clipping, smoothing and normalization may be used for the pre-processing. The apparatus may further upload the pre-processed data along with the running condition of the vehicle 103 to the backend platform. In one aspect, the acoustic data, coupled with the running condition of the vehicle and an identity of the vehicle, may be received in real time and aggregated into a relevant database on the platform.
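The pre-processing chain of clipping, smoothing and normalization mentioned above could be sketched as follows; the specific clipping rule, window size and normalization scheme are illustrative assumptions, not taken from the disclosure.

```python
import numpy as np

def preprocess(raw_acoustic, clip_sigma=3.0, smooth_window=5):
    """Reduce the effect of noise and outliers in raw acoustic data:
    clip extreme samples, smooth with a moving average, and normalize
    to zero mean and unit variance."""
    x = np.asarray(raw_acoustic, dtype=float)
    mu, sigma = x.mean(), x.std() + 1e-9
    # Clipping: cap samples beyond clip_sigma standard deviations.
    x = np.clip(x, mu - clip_sigma * sigma, mu + clip_sigma * sigma)
    # Smoothing: moving-average filter over smooth_window samples.
    kernel = np.ones(smooth_window) / smooth_window
    x = np.convolve(x, kernel, mode="same")
    # Normalization: zero mean, unit variance.
    return (x - x.mean()) / (x.std() + 1e-9)
```

The apparatus would run such a chain on-device before uploading, so that the backend platform receives consistently scaled data regardless of sensor gain or mounting position.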
[0030] Further, the backend platform may process the acoustic data using machine learning algorithms in a continuous mode to monitor and determine any change in the pattern for a driving condition. In addition to the acoustic data, the backend platform may receive user inputs indicating the vehicle condition. In one aspect, the user inputs may comprise a vehicle status, issues and events received from the user via a smart phone 105 over a UI based application. Further, the machine learning model 112 of the backend platform may map the acoustic data pattern to the vehicle condition and start learning. Over a period of time, the machine learning model 112 may learn to map the different acoustic patterns of the vehicle to the corresponding conditions and issues.
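One way to sketch how the model maps acoustic patterns to user-reported conditions is a running per-condition centroid over feature vectors. This simple nearest-centroid learner is purely illustrative and stands in for the backend machine learning model 112, whose architecture the disclosure does not fix.

```python
import numpy as np

class AcousticConditionLearner:
    """Learns a mapping from acoustic feature vectors to user-reported
    vehicle conditions by maintaining a running mean (centroid) of the
    features observed for each condition label."""

    def __init__(self):
        self._centroids = {}  # label -> (mean feature vector, sample count)

    def learn(self, features, label):
        """Incorporate one labelled observation into the running centroid."""
        features = np.asarray(features, dtype=float)
        mean, n = self._centroids.get(label, (np.zeros_like(features), 0))
        self._centroids[label] = ((mean * n + features) / (n + 1), n + 1)

    def predict(self, features):
        """Return the condition whose centroid is nearest to the features."""
        features = np.asarray(features, dtype=float)
        return min(self._centroids,
                   key=lambda lbl: np.linalg.norm(features - self._centroids[lbl][0]))
```

Over time, the pattern reported alongside, say, "low tire pressure" forms its own centroid, so a later occurrence of a similar pattern is predicted as that condition.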
[0031] In one example, if a tire pressure is low, a particular acoustic pattern may be associated with the low tire pressure. In another example, if an engine oil filter is not working well, that may have its own specific acoustic data pattern.
[0032] The machine learning models 112 may associate the processing of the acoustic data with the vehicle 103, since the acoustic pattern of one vehicle might not be exactly the same as that of another vehicle. This is due to the fact that every vehicle has its own make, model, age, driving conditions, and manner of being driven. This enables the machine learning model 112 to generate personalized alerts to the user based on the learning built from that specific vehicle's acoustic patterns under different conditions and issues. In one aspect, the alerts may be issued to the user's smart phone 105 via a UI based app. In another aspect, mechanisms such as SMS may be used to inform the user regarding the issue.
[0033] Further, the backend platform may build a standard repository for a specific vehicle 103 based on its base data. This may be done based on the learning built for different vehicles. In one example, consider a Honda City car that is 5 years old, for which learning has been built. If another user registers a car of the same make, model and similar age on the platform, the backend platform may take the already learned model as a base and start learning for the new vehicle on top of it, thereby fine-tuning or customizing the base model to the new vehicle's acoustic pattern. In this case, the new vehicle's model will be learned in a shorter time than one learned from scratch.
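The base-model reuse described above could be sketched as follows, again with per-condition acoustic centroids as a stand-in for the learned model. The `base_weight` prior, which controls how strongly the matured base model anchors the personalized model, is an illustrative assumption.

```python
import numpy as np

def fine_tune(base_centroids, new_vehicle_samples, base_weight=5):
    """Adapt a matured base model (per-condition acoustic centroids from
    a vehicle of the same make, model and similar age) to a newly
    registered vehicle using that vehicle's own labelled samples."""
    centroids = {c: np.asarray(v, dtype=float) for c, v in base_centroids.items()}
    counts = {c: base_weight for c in centroids}  # base acts as prior samples
    for features, condition in new_vehicle_samples:
        f = np.asarray(features, dtype=float)
        n = counts.get(condition, 0)
        prev = centroids.get(condition, np.zeros_like(f))
        # Running mean: new samples pull the centroid toward this vehicle.
        centroids[condition] = (prev * n + f) / (n + 1)
        counts[condition] = n + 1
    return centroids
```

Because each new sample only nudges the inherited centroids, the personalized model converges after far fewer observations than a model started from scratch, which is the shortened learning time the paragraph describes.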
[0034] Referring now to figure 2, the system 102 for generating a personalized maintenance alert for a vehicle 103 is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[0035] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the user device 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server.
[0036] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0037] The modules 208 may include routines, programs, objects, components, data structures, and the like, which perform particular tasks or functions or implement particular abstract data types. In one implementation, the modules 208 may include a data receiving module 212, a filtering module 214, a pattern determination module 216, an identification module 218, a development module 219, an issue prediction module 220, a generation module 222, and other modules 224. The other modules 224 may include programs or coded instructions that supplement applications and functions of the system 102.
[0038] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 210 may also include a repository 226, and other data 228. In one embodiment, the other data 228 may include data generated as a result of the execution of one or more modules in the other modules 224.
[0039] In one implementation, a user may access the system 102 via the I/O interface 204. The user may be registered using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information, providing input information or configuring the system 102.
[0040] In another implementation, the user may not be able to directly access the system 102. The user may access the system 102 via an application in a device 105. Examples of the device may include, but are not limited to, a mobile phone, a laptop and the like.
[0041] In one embodiment, the data receiving module 212 may receive acoustic data associated with one or more segments of the vehicle. The acoustic data may be received from one or more acoustic sensors 110 mounted in the one or more segments. The acoustic data may be received in real-time. The acoustic data may be received via a wired communication or a wireless communication. In one aspect, the acoustic data may be stored in the repository 226.
[0042] The one or more segments may comprise an engine compartment, a vehicle inside cabin, a vehicle back side and the like. Further, one or more components may be associated with the one or more segments. The one or more components may comprise an engine, a braking system, a transmission system, a vehicle tire, a wheel alignment, an emission system, an Air Conditioning (AC) system, and the like.
[0043] Further, the data receiving module 212 may receive vehicle data. In one aspect, the vehicle data may be fetched from a telematics unit. In another embodiment, the vehicle data may be received from a device based on user inputs. The vehicle data may comprise a vehicle driving condition, a vehicle ID, a vehicle speed, a vehicle age, a manufacturer of the vehicle and the like. In one aspect, driver data may be received. The driver data may comprise a driver name, age, health issues and the like.
[0044] Once the acoustic data is received, the filtering module 214 may filter noise data from the acoustic data. The noise data is filtered based on applying a filtering technique on the acoustic data. In one embodiment, the filtering technique may comprise determining a set of frequencies in the acoustic data of the one or more segments. The set of frequencies may be determined based on an analysis of the acoustic data. Once the set of frequencies is determined, the filtering technique may comprise comparing the set of frequencies associated with the one or more segments. Based on the comparison, the filtering technique may comprise identifying a subset of frequencies from the set of frequencies. The subset of frequencies may indicate the noise data. In other words, the subset of frequencies may be common frequencies in the acoustic data. Further, the filtering module 214 may filter the noise data from the acoustic data of the one or more segments.
[0045] Upon filtering the noise data, the pattern determination module 216 may determine an acoustic pattern of the one or more segments. The acoustic pattern may be determined using a machine learning model 112. The acoustic data may be analysed using the machine learning model 112 to determine the acoustic pattern. In one embodiment, the analysis of the acoustic data by the machine learning model 112 may be performed over a backend platform.
[0046] In one embodiment, the machine learning model 112 may continuously analyse the acoustic data from the one or more segments. The acoustic data may be analysed in real-time. Based on the analysis, the machine learning model 112 may determine the acoustic pattern associated with the one or more segments. The machine learning model 112 may use a collaborative learning technique to determine the acoustic pattern. In other words, the machine learning model 112 may learn the acoustic pattern associated with the one or more segments.
[0047] Once the acoustic pattern is determined, the identification module 218 may identify a pattern change. The pattern change may be identified based on comparison of the acoustic pattern, an ideal acoustic pattern, and a historical acoustic pattern of the one or more segments. The pattern change may indicate deterioration of the acoustic pattern from the ideal acoustic pattern and the historical acoustic pattern.
[0048] In one embodiment, the acoustic pattern may be identical to the ideal acoustic pattern. In another embodiment, the acoustic pattern may be identical to the historical acoustic pattern. In yet another embodiment, the acoustic pattern may be less deteriorated from the historical acoustic pattern and more deteriorated from the ideal acoustic pattern. In yet another embodiment, the acoustic pattern may be less deteriorated from the ideal acoustic pattern and more deteriorated from the historical acoustic pattern.
[0049] In one embodiment, the repository 226 may store the ideal acoustic pattern, the historical acoustic pattern, a segment name, the one or more components and the like. The ideal acoustic pattern may indicate good condition of the one or more segments. The historical acoustic pattern may be a previous acoustic pattern of the one or more segments.
[0050] Further, the development module 219 may mature and develop the machine learning model 112 individually for the vehicle 103 based on the acoustic data and corresponding user inputs. The machine learning model 112 may be matured and developed using a collaborative learning technique. The user inputs may comprise a vehicle status, vehicle issues, faults or events received via a smart phone 105 over a UI based application.
[0051] Upon identifying the pattern change, the issue prediction module 220 may predict an issue associated with the one or more segments. The issue may be predicted based on an analysis of the pattern change and the corresponding user inputs. In one aspect, the issue may be predicted based on the deterioration of the acoustic pattern from the ideal acoustic pattern and the historical acoustic pattern. The issue may correspond to a failure of the one or more components present in the one or more segments.
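A minimal sketch of combining the deterioration score with the vehicle data might be a threshold rule in which an older vehicle is flagged at a smaller deterioration score. The threshold values and the age adjustment below are illustrative assumptions only; the disclosure leaves the prediction logic to the matured machine learning model.

```python
def predict_issue(segment, deterioration_score, vehicle_data,
                  base_threshold=0.3, age_factor=0.02):
    """Predict a failure for a segment when its acoustic deterioration
    score exceeds a threshold that tightens with vehicle age."""
    age = min(vehicle_data.get("vehicle_age_years", 0), 10)  # cap the adjustment
    threshold = base_threshold - age_factor * age
    return {
        "segment": segment,
        "issue_predicted": deterioration_score > threshold,
        "score": deterioration_score,
        "threshold": threshold,
    }
```

Under this rule, the same amount of acoustic drift triggers an alert for an 8-year-old vehicle but not for a new one, reflecting the use of vehicle age in the prediction.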
[0052] Once the issue is predicted, the generation module 222 may generate one or more maintenance alerts for the vehicle 103. The one or more maintenance alerts may be associated with the one or more segments. The maintenance alert may indicate prediction of the issue associated with the one or more segments. The one or more maintenance alerts may be transmitted to a user device. The user device may comprise a mobile phone, a display and the like. In one aspect, the generation module 222 may store the one or more maintenance alerts in the repository 226. In one aspect, the maintenance alerts may be personalized for the vehicle 103. In one aspect, a personalized maintenance alert may be generated based on the vehicle data.
[0053] In one embodiment, the generation module 222 may generate a model repository. The model repository may store a set of machine learning models associated with a set of vehicles. Each machine learning model 112 is matured to generate the maintenance alert using a collaborative learning technique. The set of machine learning models may comprise the manufacturer of the vehicle, the issue of the vehicle, the maintenance alert, the acoustic pattern of the one or more segments, user inputs, and the like. In one aspect, different machine learning models may be developed for different vehicles. The machine learning model 112 developed for a particular vehicle 103 may be referred to as a personalized model. In another aspect, the machine learning model 112 developed for one vehicle may be deployed in another vehicle.
[0054] In one embodiment, a standard repository for a specific vehicle 103 may be generated based on its base data, by reusing learning built for different vehicles. In one example, consider a Honda City car that is five years old, for which learning has already been built. If another user registers a car of the same make, model and similar age on the platform, the backend platform may take the models already learned as a base and start learning for the new vehicle on top of that, thereby fine tuning or customizing the base model to the new vehicle's acoustic pattern. In this case, the new vehicle's model will be learned in a shorter time than a model learned from scratch.
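The benefit of warm-starting from a base model can be illustrated with a minimal sketch. This is not the actual backend algorithm; an exponential moving average standing in for the learned acoustic baseline, and all values, are assumptions for the example.

```python
# Illustrative sketch: a new vehicle's acoustic baseline warm-started from a
# similar vehicle's base model ends closer to the true level than a baseline
# learned from scratch, given the same number of samples.

def update_baseline(baseline, sample, alpha=0.2):
    """One exponential-moving-average update toward an observed acoustic level."""
    return baseline + alpha * (sample - baseline)


def learn(samples, start):
    baseline = start
    for s in samples:
        baseline = update_baseline(baseline, s)
    return baseline


true_level = 10.0
samples = [true_level] * 5

from_scratch = learn(samples, start=0.0)  # no prior knowledge
fine_tuned = learn(samples, start=9.0)    # base model from a similar vehicle
```

After the same five updates, the warm-started baseline is much closer to the true acoustic level, which is the "shorter learning time" effect described above.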
[0055] In one exemplary embodiment, consider acoustic sensors 110 mounted at an engine system, a braking system and tires of a vehicle 103. The vehicle 103 may be running on a road. In this case, vehicle data comprising vehicle speed and vehicle running condition may be received. Further, engine acoustic data may be received from the acoustic sensor 110 mounted at the engine system. Furthermore, brake acoustic data may be received from the acoustic sensor 110 mounted at the braking system. Furthermore, tire acoustic data may be received from the acoustic sensor 110 mounted at the tires.
[0056] Once the engine acoustic data, the brake acoustic data and the tire acoustic data are received, common frequencies present in the engine acoustic data, the brake acoustic data and the tire acoustic data may be determined. The common frequencies may be determined based on an analysis of the engine acoustic data, the brake acoustic data and the tire acoustic data. The common frequencies may indicate noise data. Further, the noise data may be filtered from the engine acoustic data, the brake acoustic data and the tire acoustic data. Upon filtering the noise data, an engine acoustic pattern, a brake acoustic pattern and a tire acoustic pattern may be identified. The engine acoustic pattern may be identified based on filtering the noise data from the engine acoustic data. Similarly, the brake acoustic pattern and the tire acoustic pattern may be identified based on filtering the noise data from the brake acoustic data and the tire acoustic data, respectively.
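The common-frequency filtering step above can be sketched as follows. This is a minimal illustration, assuming each segment's acoustic data has already been reduced to a set of dominant frequency bins (in Hz); the function name and the example frequencies are assumptions, not part of the claimed system.

```python
# Sketch of the filtering technique: frequencies common to all segments are
# treated as ambient/road noise and removed, leaving each segment's own pattern.

def filter_common_noise(segment_frequencies):
    """segment_frequencies: dict mapping segment name -> set of frequency bins (Hz)."""
    # Frequencies present in every segment are taken as shared noise data.
    noise = set.intersection(*segment_frequencies.values())
    patterns = {seg: freqs - noise for seg, freqs in segment_frequencies.items()}
    return patterns, noise


acoustic = {
    "engine": {50, 120, 800},
    "brakes": {50, 120, 2500},
    "tires":  {50, 120, 300},
}
patterns, noise = filter_common_noise(acoustic)
# 50 Hz and 120 Hz appear in all three segments, so they are filtered as noise,
# leaving the engine, brake and tire acoustic patterns.
```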
[0057] Further, the engine acoustic pattern may be compared with an ideal engine pattern and a historical engine pattern. The brake acoustic pattern may be compared with an ideal brake pattern and a historical brake pattern. The tire acoustic pattern may be compared with an ideal tire pattern and a historical tire pattern. Based on the comparison, a pattern change may be determined. Once the pattern change is determined, an issue associated with one of the engine system, the braking system or the tires may be predicted. Upon predicting the issue, an alert may be generated and transmitted to a mobile phone of a driver. Based on the alert, the driver may plan maintenance of the vehicle.
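The comparison against the ideal and historical patterns can be sketched as below. The root-mean-square distance metric and the threshold are illustrative assumptions; the disclosure does not fix a particular metric.

```python
# Sketch of pattern-change identification: a change is flagged when the current
# acoustic pattern deviates from BOTH the ideal and the historical pattern,
# since deviation from the ideal alone may only reflect normal ageing that the
# historical pattern already captures.
import math


def deviation(a, b):
    """Root-mean-square difference between two equal-length spectra."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))


def pattern_change(current, ideal, historical, threshold=1.0):
    return (deviation(current, ideal) > threshold
            and deviation(current, historical) > threshold)


ideal = [1.0, 2.0, 1.5]
historical = [1.2, 2.1, 1.6]   # normal wear already seen on this vehicle
worn = [1.3, 2.2, 1.7]         # consistent with normal ageing, no alert
failing = [4.0, 5.0, 6.0]      # deviates from both references, pattern change
```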
[0058] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[0059] Some embodiments of the system and the method are configured to generate a personalized maintenance alert for a vehicle.
[0060] Some embodiments of the system and the method are configured to mature a backend model deployed in a vehicle to generate a personalized maintenance alert.
[0061] Some embodiments of the system and the method are configured to maintain a standard model repository to store data associated with different vehicles.
[0062] Referring now to figure 3, a method 300 for generating a personalized maintenance alert for a vehicle 103, is disclosed in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like, that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0063] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[0064] At block 302, acoustic data and vehicle data may be received. In one implementation, the data receiving module 212 may receive the acoustic data associated with one or more segments of a vehicle 103, and the vehicle data. The acoustic data may be received from one or more acoustic sensors 110 mounted in the one or more segments. The one or more segments may comprise an engine compartment, a vehicle inside cabin, a vehicle back side, and the like. Further, the vehicle data may comprise a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, a manufacturer of the vehicle and the like.
[0065] At block 304, noise data may be filtered from the acoustic data. In one implementation, the filtering module 214 may filter the noise data from the acoustic data of the one or more segments. In one embodiment, the noise data may be filtered based on applying a filtering technique on the acoustic data.
[0066] At block 306, an acoustic pattern associated with the one or more segments may be determined. In one implementation, the pattern determination module 216 may determine the acoustic pattern associated with the one or more segments. The acoustic pattern may be determined using a machine learning model upon filtration of the noise data.
[0067] At block 308, a pattern change may be identified. In one implementation, the identification module 218 may identify the pattern change. The pattern change may be identified based on comparison of the acoustic pattern, an ideal acoustic pattern and a historical acoustic pattern of the one or more segments.
[0068] At block 310, the machine learning model 112 may be developed and matured. In one implementation, the development module 219 may develop and mature the machine learning model 112 based on the acoustic data and corresponding user inputs. The machine learning model 112 may be developed and matured using a collaborative learning technique.
[0069] At block 312, an issue associated with the one or more segments may be predicted. In one implementation, the issue prediction module 220 may predict the issue. The issue may be predicted based on an analysis of the pattern change and the vehicle data.
[0070] At block 314, a personalized maintenance alert for the vehicle 103 may be generated. In one implementation, the generation module 222 may generate the personalized maintenance alert for one or more segments of the vehicle 103. The personalized alert may be generated based on the prediction of the issue.
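The blocks of method 300 can be sketched end to end as one pipeline. This is an illustrative sketch only: the function, the equality-based pattern comparison, and the alert text are assumptions standing in for the modules described above.

```python
# Illustrative pipeline for blocks 302-314: receive acoustic data per segment,
# filter common-frequency noise, compare each segment's pattern against its
# ideal and historical patterns, and emit a personalized alert on a mismatch.

def maintenance_pipeline(acoustic_by_segment, ideal, historical, vehicle_data):
    # Block 304: frequencies common to every segment are filtered as noise.
    noise = set.intersection(*acoustic_by_segment.values())
    patterns = {seg: freqs - noise for seg, freqs in acoustic_by_segment.items()}
    alerts = []
    for seg, pattern in patterns.items():
        # Blocks 306-312: a pattern deviating from both references signals an issue.
        if pattern != ideal[seg] and pattern != historical[seg]:
            # Block 314: personalize the alert using the vehicle data.
            alerts.append(f"{vehicle_data['manufacturer']} {seg}: check for possible fault")
    return alerts


acoustic = {"engine": {50, 800, 950}, "brakes": {50, 2500}}
ideal = {"engine": {800}, "brakes": {2500}}
historical = {"engine": {800}, "brakes": {2500}}
alerts = maintenance_pipeline(acoustic, ideal, historical, {"manufacturer": "Honda"})
# The engine pattern contains an unexpected 950 Hz component, so only the
# engine segment produces an alert.
```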
[0071] Although implementations for systems and methods for generating a personalized maintenance alert for a vehicle have been described, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for generating the personalized maintenance alert for the vehicle.
Claims:
1. A system (102) for generating a personalized maintenance alert for a vehicle (103), the system comprising:
a memory;
a processor coupled to the memory, wherein the processor is configured to execute instructions stored in the memory to:
receive acoustic data associated with one or more segments of a vehicle (103) from one or more acoustic sensors (110) mounted in the one or more segments, and vehicle data, wherein the one or more segments comprise an engine compartment, a vehicle inside cabin, and a vehicle back side, and wherein the vehicle data comprises a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, and a manufacturer of the vehicle;
filter noise data from the acoustic data of the one or more segments, wherein the noise data is filtered based on applying a filtering technique on the acoustic data;
determine an acoustic pattern associated with the one or more segments using a machine learning model (112) upon filtration of the noise data;
identify a pattern change based on comparison of the acoustic pattern, an ideal acoustic pattern and a historical acoustic pattern of the one or more segments;
develop and mature the machine learning model (112) individually for the vehicle (103) based on the acoustic data and corresponding user inputs using a collaborative learning technique;
predict an issue associated with the one or more segments based on the pattern change and the vehicle data; and
generate a personalized maintenance alert for one or more segments of the vehicle (103) based on the prediction of the issue.
2. The system (102) as claimed in claim 1, wherein the filtering technique comprises:
determining a set of frequencies associated with the acoustic data of the one or more segments;
identifying a subset of frequencies from the set of frequencies based on comparison of the set of frequencies associated with the one or more segments; and
filtering the subset of frequencies from the acoustic data of the one or more segments, wherein the subset of frequencies indicates the noise data.
3. The system (102) as claimed in claim 1, wherein the user inputs comprise a vehicle status, vehicle issues, and faults or events received via a smart phone over a UI based application.
4. The system (102) as claimed in claim 1, further configured to generate a model repository, wherein the model repository stores a set of machine learning models associated with different vehicle profiles.
5. A method for generating a personalized maintenance alert for a vehicle (103), the method comprises:
receiving, by a processor, acoustic data associated with one or more segments of a vehicle (103) from one or more acoustic sensors (110) mounted in the one or more segments, and vehicle data, wherein the one or more segments comprise an engine compartment, a vehicle inside cabin, and a vehicle back side, and wherein the vehicle data comprises a vehicle driving condition, a vehicle speed, a vehicle issue, a vehicle age, and a manufacturer of the vehicle;
filtering, by the processor, noise data from the acoustic data of the one or more segments, wherein the noise data is filtered based on applying a filtering technique on the acoustic data;
determining, by the processor, an acoustic pattern associated with the one or more segments using a machine learning model (112) upon filtration of the noise data;
identifying, by the processor, a pattern change based on comparison of the acoustic pattern, an ideal acoustic pattern and a historical acoustic pattern of the one or more segments;
developing and maturing the machine learning model (112) individually for the vehicle (103) based on the acoustic data and corresponding user inputs using a collaborative learning technique;
predicting, by the processor, an issue associated with the one or more segments based on the pattern change and the vehicle data; and
generating, by the processor, a personalized maintenance alert for one or more segments of the vehicle (103) based on the prediction of the issue.
6. The method as claimed in claim 5, wherein the filtering technique comprises:
determining a set of frequencies associated with the acoustic data of the one or more segments;
identifying a subset of frequencies from the set of frequencies based on comparison of the set of frequencies associated with the one or more segments; and
filtering the subset of frequencies from the acoustic data of the one or more segments, wherein the subset of frequencies indicates the noise data.
7. The method as claimed in claim 5, wherein the user inputs comprise a vehicle status, vehicle issues, and faults or events received via a smart phone over a UI based application.
8. The method as claimed in claim 5, further comprises generating a model repository, wherein the model repository stores a set of machine learning models associated with different vehicle profiles.
| # | Name | Date |
|---|---|---|
| 1 | 201911009040-IntimationOfGrant19-01-2024.pdf | 2024-01-19 |
| 2 | 201911009040-STATEMENT OF UNDERTAKING (FORM 3) [08-03-2019(online)].pdf | 2019-03-08 |
| 3 | 201911009040-PatentCertificate19-01-2024.pdf | 2024-01-19 |
| 4 | 201911009040-REQUEST FOR EXAMINATION (FORM-18) [08-03-2019(online)].pdf | 2019-03-08 |
| 5 | 201911009040-REQUEST FOR EARLY PUBLICATION(FORM-9) [08-03-2019(online)].pdf | 2019-03-08 |
| 6 | 201911009040-FORM 13 [27-12-2023(online)].pdf | 2023-12-27 |
| 7 | 201911009040-POWER OF AUTHORITY [08-03-2019(online)].pdf | 2019-03-08 |
| 8 | 201911009040-POA [27-12-2023(online)].pdf | 2023-12-27 |
| 9 | 201911009040-RELEVANT DOCUMENTS [27-12-2023(online)].pdf | 2023-12-27 |
| 10 | 201911009040-FORM-9 [08-03-2019(online)].pdf | 2019-03-08 |
| 11 | 201911009040-Written submissions and relevant documents [27-12-2023(online)].pdf | 2023-12-27 |
| 12 | 201911009040-FORM 18 [08-03-2019(online)].pdf | 2019-03-08 |
| 13 | 201911009040-FORM-26 [07-12-2023(online)].pdf | 2023-12-07 |
| 14 | 201911009040-FORM 1 [08-03-2019(online)].pdf | 2019-03-08 |
| 15 | 201911009040-FIGURE OF ABSTRACT [08-03-2019(online)].jpg | 2019-03-08 |
| 16 | 201911009040-Correspondence to notify the Controller [04-12-2023(online)].pdf | 2023-12-04 |
| 17 | 201911009040-DRAWINGS [08-03-2019(online)].pdf | 2019-03-08 |
| 18 | 201911009040-US(14)-HearingNotice-(HearingDate-13-12-2023).pdf | 2023-11-21 |
| 19 | 201911009040-COMPLETE SPECIFICATION [08-03-2019(online)].pdf | 2019-03-08 |
| 20 | 201911009040-FER.pdf | 2021-10-18 |
| 21 | 201911009040-Proof of Right [13-10-2021(online)].pdf | 2021-10-13 |
| 22 | abstract.jpg | 2019-04-11 |
| 23 | 201911009040-FORM 13 [09-07-2021(online)].pdf | 2021-07-09 |
| 24 | 201911009040-Proof of Right (MANDATORY) [05-09-2019(online)].pdf | 2019-09-05 |
| 25 | 201911009040-OTHERS-120919.pdf | 2019-09-13 |
| 26 | 201911009040-POA [09-07-2021(online)].pdf | 2021-07-09 |
| 27 | 201911009040-CLAIMS [04-06-2021(online)].pdf | 2021-06-04 |
| 28 | 201911009040-Correspondence-120919.pdf | 2019-09-13 |
| 29 | 201911009040-COMPLETE SPECIFICATION [04-06-2021(online)].pdf | 2021-06-04 |
| 30 | 201911009040-OTHERS [04-06-2021(online)].pdf | 2021-06-04 |
| 31 | 201911009040-FER_SER_REPLY [04-06-2021(online)].pdf | 2021-06-04 |
| 32 | 201911009040strategyE_22-02-2021.pdf | |