METHOD AND SYSTEM FOR TRANSMITTING VIDEO DATA ASSOCIATED WITH A VEHICLE OPERATOR

ABSTRACT

A method and system for transmitting a video data associated with a vehicle operator of a vehicle is disclosed. The method includes monitoring (302) an activity of the vehicle operator in real-time during an active state of the vehicle; detecting (304) an anomaly in a state of the vehicle operator based on a set of parameters; upon detecting the anomaly (306), sending (308) an anomaly detection alert to a Telematic Control Unit (TCU) (112); and capturing (310) the video data corresponding to the anomaly for a predefined time interval. The method further includes sending (312) a data transfer request to the TCU (112) via a Controller Area Network (CAN) to initiate transfer of the video data; receiving (314) an acceptance response from the TCU (112) based on one or more network parameters; and transmitting (316) the video data to the TCU (112) via a wireless communication network upon receiving the acceptance response. [To be published with FIG. 3]
DESCRIPTION
TECHNICAL FIELD
[001] This disclosure relates generally to data transmission, and more particularly to a method and a system for transmitting a video data associated with a vehicle operator.
BACKGROUND
[002] Many commercial vehicles use surveillance systems to ensure safety and facilitate a seamless driving experience. These surveillance systems often include cameras that continuously monitor the activities of a vehicle operator (i.e., a driver) and the surrounding environment to alert the driver about any potential hazard or deviation from safe driving practices. Monitoring the activities of the driver while driving is important to ensure the well-being of the driver, as any impairment or distraction may be hazardous not only for the driver but also for passengers and other road users. For instance, if the driver gets distracted, for example by using a smartphone, their capacity to make sound decisions may be compromised, potentially resulting in delayed responses and difficulty in maintaining focus while driving. This compromised attention can increase the likelihood of road accidents. In particular, most road accidents occur when the driver's response time is too slow or when the driver fails to see relevant upcoming hazards. In such conditions, these surveillance systems act as an assistant that warns the driver by providing timely alerts about the driver's own behaviour and status.
[003] However, existing surveillance systems face prominent challenges, especially with rapid technological advancements. The currently used surveillance systems lack an effective method for monitoring, recording, and promptly alerting about the driver's activities. Moreover, the currently used surveillance systems present a significant challenge in storing the large volume of video recordings captured for scenarios of driver impairment or distraction. In addition, efficiently and reliably retrieving a specific segment of a video recording from vast amounts of recorded data is often challenging with the currently used surveillance systems. Further, considering the increase in the number of road accidents, it becomes crucial to have a detailed record (e.g., recorded video data) of an accident and the driver's actions before the accident. Such a record can be useful for the driver in legal situations, serving as evidence when making insurance claims or defending against accusations. Unfortunately, the currently used surveillance systems fall short in simultaneously managing tasks such as real-time monitoring, maintenance, and transmission of recorded video data. Therefore, there is a need for enhanced technology that addresses the above challenges to improve driver safety and legal protection.
[004] Therefore, there is a requirement for an efficient and reliable method and system for transmitting video data associated with vehicle operators.
SUMMARY
[005] In one embodiment, a method of transmitting a video data associated with a vehicle operator of a vehicle is disclosed. The method may include monitoring, via a camera, an activity of the vehicle operator in real-time during an active state of the vehicle. The method may further include detecting an anomaly in a state of the vehicle operator based on a set of parameters in response to monitoring the activity. Upon detecting the anomaly, the method may include sending an anomaly detection alert to a Telematic Control Unit (TCU) over a Controller Area Network (CAN), and capturing, via the camera, the video data corresponding to the anomaly for a predefined time interval. The method may further include sending a data transfer request to the TCU via the CAN to initiate transfer of the video data. The data transfer request may be sent periodically at a pre-defined time interval. The method may further include receiving an acceptance response from the TCU based on one or more network parameters in response to sending the data transfer request. The method may further include transmitting the video data to the TCU via a wireless communication network upon receiving the acceptance response.
[006] In another embodiment, a system of transmitting a video data associated with a vehicle operator of a vehicle is disclosed. The system may include a processor and a memory communicatively coupled to the processor. The memory may store processor-executable instructions, which when executed by the processor may cause the processor to monitor, via a camera, an activity of the vehicle operator in real-time during an active state of the vehicle. The processor-executable instructions, on execution, may further cause the processor to detect an anomaly in a state of the vehicle operator based on a set of parameters in response to monitoring the activity. Upon detecting the anomaly, the processor-executable instructions, on execution, may further cause the processor to send an anomaly detection alert to a Telematic Control Unit (TCU) over a Controller Area Network (CAN), and capture, via the camera, the video data corresponding to the anomaly for a predefined time interval. The processor-executable instructions, on execution, may further cause the processor to send a data transfer request to the TCU via the CAN to initiate transfer of the video data. The data transfer request may be sent periodically at a pre-defined time interval. The processor-executable instructions, on execution, may further cause the processor to receive an acceptance response from the TCU based on one or more network parameters in response to sending the data transfer request. The processor-executable instructions, on execution, may further cause the processor to transmit the video data to the TCU via a wireless communication network upon receiving the acceptance response.
[007] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
[009] FIG.1 illustrates a block diagram of an exemplary system for transmitting a video data associated with a vehicle operator of a vehicle, in accordance with some embodiments of the present disclosure.
[010] FIG.2 illustrates a functional block diagram of various modules within a memory of a driver monitoring system configured for transmitting a video data associated with a vehicle operator of a vehicle, in accordance with some embodiments of the present disclosure.
[011] FIG. 3 illustrates a flow diagram of a method of transmitting a video data associated with a vehicle operator of a vehicle, in accordance with some embodiments of the present disclosure.
[012] FIGS. 4A and 4B illustrate a flow diagram of a detailed exemplary process of transmitting a video data associated with a vehicle operator of a vehicle, in accordance with some embodiments of the present disclosure.
[013] FIGS. 5A - 5C illustrate an exemplary flow diagram depicting a flow of communication between a DMS and a TCU, in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION
[014] Exemplary embodiments are described with reference to the accompanying drawings. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments. It is intended that the following detailed description be considered exemplary only, with the true scope being indicated by the following claims. Additional illustrative embodiments are listed.
[015] Further, the phrases “in some embodiments”, “in accordance with some embodiments”, “in the embodiments shown”, “in other embodiments”, and the like mean a particular feature, structure, or characteristic following the phrase is included in at least one embodiment of the present disclosure and may be included in more than one embodiment. In addition, such phrases do not necessarily refer to the same embodiments or different embodiments. It is intended that the following detailed description be considered exemplary only, with the true scope being indicated by the following claims.
[016] Referring now to FIG. 1, a block diagram of an exemplary system 100 for transmitting a video data associated with a vehicle operator of a vehicle is illustrated, in accordance with some embodiments of the present disclosure. In order to perform transmission of the video data, the system 100 may include a driver monitoring system (DMS) 102. As will be appreciated, the DMS 102 may be integrated within a dashboard of the vehicle. In some embodiments, the DMS 102 may be an external system (e.g., a plug and play device) that is coupled to the vehicle via an associated port present within the vehicle. The vehicle may correspond to a private vehicle (e.g., a car) or a commercial vehicle (e.g., a taxi, a bus, a van, a truck, etc.).
[017] To perform transmission of the video data, initially, the DMS 102 may be configured to monitor an activity of the vehicle operator in real-time via a camera 108. In an embodiment, the camera 108 may be wirelessly connected to the DMS 102 externally through wireless network interfaces such as Bluetooth®, infrared, or any other wireless radio communication known in the art. In another embodiment, the camera 108 may be in-built in the DMS 102. In yet another embodiment, the camera 108 may be connected to a communication pathway of one or more components of the DMS 102 for capturing and monitoring the activity of the vehicle operator. Examples of the camera 108 may include, but are not limited to, a forward-facing camera, an in-cabin camera, a driver-facing camera, a dual-lens dash camera, and the like. The camera 108 may be configured to monitor the activity of the vehicle operator during an active state of the vehicle. In an embodiment, the active state of the vehicle corresponds to an ‘on’ condition, i.e., when an ignition of the vehicle is in the ‘on’ state.
[018] Further, in response to monitoring the activity, the DMS 102 may be configured to detect an anomaly in a state of the vehicle operator based on a set of parameters. In an embodiment, the set of parameters may include one or more vehicle parameters and one or more vehicle operator parameters. The one or more vehicle parameters, for example, may include an acceleration of the vehicle, sudden braking, a vehicle speed, and the like. Further, examples of the one or more vehicle operator parameters may include, but are not limited to, facial attributes, hand movements, head movements, posture and body language, and the like. Examples of the anomaly may include, but are not limited to, drowsiness, distraction, fatigue, impairment, and the like.
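By way of illustration only, a minimal sketch of how such a rule-based check might combine the one or more vehicle parameters and the one or more vehicle operator parameters to flag an anomaly is given below. The parameter names, threshold values, and anomaly labels are assumptions introduced here for clarity and do not form part of the disclosure.

```python
# Illustrative sketch only: parameter names, thresholds, and labels are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class OperatorState:
    eye_closure_ratio: float   # fraction of recent frames with eyes closed
    head_pitch_deg: float      # downward head tilt estimated from the camera
    hands_on_wheel: bool

@dataclass
class VehicleState:
    speed_kmph: float
    hard_brake: bool           # sudden-braking flag read from the vehicle bus

def detect_anomaly(op: OperatorState, veh: VehicleState) -> Optional[str]:
    """Return an anomaly label such as 'drowsiness' or 'distraction', or None."""
    if op.eye_closure_ratio > 0.4 and veh.speed_kmph > 30:
        return "drowsiness"
    if (not op.hands_on_wheel or op.head_pitch_deg > 25) and veh.speed_kmph > 10:
        return "distraction"
    if veh.hard_brake and op.eye_closure_ratio > 0.2:
        return "fatigue"
    return None
```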
[019] Upon detecting the anomaly, the DMS 102 may send an anomaly detection alert to a Telematic Control Unit (TCU) 112 over a Controller Area Network (CAN). As will be appreciated, the CAN may be a robust serial communication protocol primarily used for enabling efficient and reliable communication between various electronic components of the vehicle. Apart from the CAN, the anomaly detection alert may be sent by the DMS 102 to the TCU 112 over any other existing vehicle communication bus or wireless protocol, including but not limited to a LIN (Local Interconnect Network), a FlexRay, a MOST (Media Oriented Systems Transport), and the like. Further, in some embodiments, the anomaly detection alert may be sent to the vehicle operator to alert the vehicle operator to take appropriate actions (e.g., stop the vehicle, call an emergency contact, etc.) or precautionary measures.
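As a hedged illustration of sending such an alert over the CAN, the sketch below uses the python-can library; the channel name, arbitration identifier, and one-byte payload encoding of the anomaly type are assumptions and not part of the disclosure.

```python
# Illustrative sketch using python-can; the channel, arbitration ID, and
# payload layout are assumptions, not part of the disclosure.
import can

ANOMALY_ALERT_ID = 0x3A0   # assumed CAN identifier reserved for DMS alerts
ANOMALY_CODES = {"drowsiness": 1, "distraction": 2, "fatigue": 3, "impairment": 4}

def send_anomaly_alert(anomaly: str, channel: str = "can0") -> None:
    """Send a one-frame anomaly detection alert from the DMS to the TCU over CAN."""
    with can.Bus(channel=channel, interface="socketcan") as bus:
        frame = can.Message(
            arbitration_id=ANOMALY_ALERT_ID,
            data=[ANOMALY_CODES.get(anomaly, 0)],
            is_extended_id=False,
        )
        bus.send(frame)
```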
[020] In addition to sending the anomaly detection alert, the DMS 102 may be configured to capture the video data corresponding to the anomaly via the camera 108. In an embodiment, the DMS 102 may capture the video data for a predefined time interval (for example: 30 seconds). In an embodiment, the DMS 102 may store the video data in a local database of the DMS 102. The local database may reside within a memory 106 of the DMS 102. Further, upon capturing the video data, the DMS 102 may send a data transfer request to the TCU 112 to initiate transfer of the video data. The DMS 102 may send the data transfer request to the TCU 112 via the CAN. In an embodiment, the data transfer request is sent periodically at a pre-defined time interval (for example: after every 5 minutes).
[021] Further, in response to sending the data transfer request, the DMS 102 may receive an acceptance response from the TCU 112. The DMS 102 may receive the acceptance response based on one or more network parameters. Examples of the one or more network parameters may include, but are not limited to, signal strength, noise level, data transmission rate, and the like. In an embodiment, the DMS 102 may resend the data transfer request to the TCU 112 at the pre-defined time interval, i.e., after 5 minutes, upon receiving a rejection response from the TCU 112. Further, upon receiving the acceptance response, the DMS 102 may transmit the video data to the TCU 112 via a wireless communication network. The wireless communication network may correspond to a network 110. In other words, the DMS 102 may transmit the video data to the TCU 112 over the network 110. In an embodiment, the TCU 112 may be implemented in the vehicle. The TCU 112 may be used to gather and manage data related to the vehicle's performance, location, and operational status. The TCU 112 may be responsible for collecting information from various sensors integrated within the vehicle and then transmitting this data to the DMS 102 for further processing. As will be appreciated, the wireless communication network may be periodically enabled and disabled to save power consumption of the DMS 102 and the TCU 112.
[022] Further, upon receiving the video data, the TCU 112 may be configured to upload the video data to a cloud server 118. The TCU 112 may upload the video data to the cloud server 118 via a communication Application Programming Interface (API) through the network 110. The network 110 can be implemented as one of the different types of networks, such as, but not limited to, an Ethernet IP network, an intranet, a local area network (LAN), a wide area network (WAN), the internet, Wi-Fi, an LTE network, a CDMA network, 5G, and the like. Further, the network 110 can either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 110 can include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[023] In an embodiment, the cloud server 118 may correspond to an on-premises server or a cloud platform, e.g., a Tata Motors Limited Connected Vehicle Platform (TMLCVP), a Google Cloud Platform, or Amazon Web Services. Examples of the communication API may include, but are not limited to, Twilio, SendGrid, Google Cloud Messaging (GCM), and WebSocket. The cloud server 118 may be configured to store the video data on a database 120. As will be appreciated, the database 120 may be updated with new video data captured for a new anomaly. Once the video data is stored on the database 120, an end-user (e.g., a vehicle owner, a family member, a friend, or the vehicle operator, etc.) may be able to retrieve and watch the video data via a user device 122 based on their requirement. This complete method for transmitting the video data of the vehicle operator is further explained in detail in conjunction with FIG. 2 – FIG. 5C.
[024] In order to transmit the video data to the TCU 112, the DMS 102 may include a processor 104 and a memory 106. The memory 106 may store instructions that, when executed by the processor 104, cause the processor 104 to perform various operations including monitoring the activity of the vehicle operator in real-time, detecting the anomaly in the state of the vehicle operator, sending the anomaly detection alert to the TCU 112 and capturing the video data corresponding to the anomaly upon detecting the anomaly, sending the data transfer request to the TCU 112, receiving the acceptance response from the TCU 112, transmitting the video data to the TCU 112, and the like. Further, in order to upload the video data to the cloud server 118, the TCU 112 may include a processor 114 and a memory 116. The memory 116 may store instructions that, when executed by the processor 114, cause the processor 114 to perform various operations including sending a data uploading request to the cloud server 118, receiving a data uploading response from the cloud server 118, transmitting the video data to the cloud server 118, and the like.
[025] In an embodiment, examples of the processor 104 and the processor 114 may include but are not limited to, an Intel® Itanium® or Itanium 2 processor(s), or AMD® Opteron® or Athlon MP® processor(s), Motorola® lines of processors, Nvidia®, FortiSOC™ system on a chip processors or other future processors. Examples of the memory 106 and the memory 116 may include, but are not limited to, a flash memory, a Read Only Memory (ROM), a Programmable ROM (PROM), Erasable PROM (EPROM), and Electrically EPROM (EEPROM) memory. Further, examples of volatile memory may include but are not limited to, Dynamic Random Access Memory (DRAM), and Static Random-Access memory (SRAM).
[026] Referring now to FIG. 2, a functional block diagram 200 of various modules within the memory 106 of the driver monitoring system 102 configured for transmitting a video data associated with a vehicle operator of a vehicle is illustrated, in accordance with some embodiments of the present disclosure. FIG. 2 is explained in conjunction with FIG. 1. In an embodiment, the memory 106 may further include an anomaly detection module 202, an alerting module 204, a data processing module 206, and a data transferring module 208.
[027] Initially, the activity of the vehicle operator may be monitored in real-time via the camera 108. In an embodiment, the activity of the vehicle operator may be monitored during the active state of the vehicle. In an embodiment, the active state of the vehicle corresponds to an ‘on’ condition, i.e., when an ignition of the vehicle is in ‘on’ state. Further, the camera 108 may be configured to transfer information corresponding to the activity of the vehicle operator to the anomaly detection module 202 in real-time.
[028] Further, the anomaly detection module 202 may be configured to detect the anomaly in the state of the vehicle operator. The anomaly detection module 202 may detect the anomaly based on the set of parameters. The set of parameters may include one or more vehicle parameters and one or more vehicle operator parameters. The one or more vehicle parameters, for example, may include an acceleration of the vehicle, sudden braking, a vehicle speed, and the like. Further, examples of the one or more vehicle operator parameters may include, but are not limited to, facial attributes, hand movements, head movements, posture and body language, and the like. Examples of the anomaly may include, but are not limited to, drowsiness, distraction, fatigue, impairment, and the like.
[029] Upon detecting the anomaly, the anomaly detection module 202 may be configured to send the detected anomaly to the alerting module 204. In other words, the anomaly detection module 202 may send a signal indicating that the anomaly has been detected to the alerting module 204. Upon receiving the signal, the alerting module 204 may be configured to send the anomaly detection alert to the TCU 112. The alerting module 204 may send the anomaly detection alert over the CAN. As will be appreciated, the CAN may be a robust serial communication protocol primarily used for enabling efficient and reliable communication between various electronic components of the vehicle.
[030] Apart from the CAN, the anomaly detection alert may be sent over any other existing vehicle communication bus or wireless protocol, including but not limited to a LIN (Local Interconnect Network), a FlexRay, a MOST (Media Oriented Systems Transport), and the like. In addition to sending the anomaly detection alert, the alerting module 204 may be configured to capture the video data corresponding to the anomaly via the camera 108. The alerting module 204 may capture the video data corresponding to the anomaly for the predefined time interval. Further, the alerting module 204 may store the video data in a local database (not shown) residing within the memory 106. Further, in some embodiments, the alerting module 204 may send the anomaly detection alert to the vehicle operator to alert the vehicle operator to take appropriate actions (e.g., stop the vehicle, call an emergency contact, etc.) or precautionary measures. Further, the alerting module 204 may be configured to notify the data processing module 206 that the anomaly detection alert has been sent to the TCU 112.
[031] The data processing module 206 may be configured to send the data transfer request to the TCU 112 via the CAN. In an embodiment, the data transfer request is sent periodically at the pre-defined time interval (for example, after every 5 minutes). Upon sending the data transfer request, the data processing module 206 may be configured to receive one of an acceptance response or a rejection response from the TCU 112. The data processing module 206 may receive one of the acceptance response or the rejection response from the TCU 112 based on one or more network parameters. Examples of the one or more network parameters may include, but are not limited to, signal strength, noise level, data transmission rate, and the like. In one embodiment, upon receiving the acceptance response, the data processing module 206 may be configured to indicate the acceptance response to the data transferring module 208. In another embodiment, upon receiving the rejection response, the data processing module 206 may be configured to resend the data transfer request to the TCU 112 at the pre-defined time interval (e.g., after every 5 minutes). It should be noted that the data processing module 206 may be configured to continue resending the data transfer request to the TCU 112 until the acceptance response is received from the TCU 112.
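A minimal sketch of this resend-until-accepted behaviour is shown below; the send_request and read_response helpers are hypothetical placeholders for the CAN-level request and response frames exchanged between the data processing module 206 and the TCU 112, and the 5-minute interval is the example value given above.

```python
# Illustrative sketch of the resend-until-accepted behaviour; send_request and
# read_response are hypothetical stand-ins for the CAN request/response frames.
import time

RETRY_INTERVAL_S = 5 * 60   # pre-defined time interval (e.g., 5 minutes)

def request_transfer_until_accepted(send_request, read_response) -> None:
    """Send the data transfer request and resend it at the pre-defined
    interval until an acceptance response is received from the TCU."""
    while True:
        send_request()                   # data transfer request over the CAN
        response = read_response()       # 'accept' or 'reject' from the TCU
        if response == "accept":
            return
        time.sleep(RETRY_INTERVAL_S)     # wait before resending after a rejection
```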
[032] Further, upon receiving the indication of the acceptance response from the data processing module 206, the data transferring module 208 may be configured to transfer the video data to the TCU 112. The data transferring module 208 may transmit the video data to the TCU 112 via the wireless communication network. Further, the TCU 112 may transfer the video data to the cloud server 118. Furthermore, the end-user (e.g., the vehicle owner, the family member, a friend, or the vehicle operator, etc.) may be able to retrieve and watch the video data via the user device 122 based on their requirement.
[033] It should be noted that all such aforementioned modules 202 – 208 may be represented as a single module or a combination of different modules. Further, as will be appreciated by those skilled in the art, each of the modules 202 – 208 may reside, in whole or in parts, on one device or multiple devices in communication with each other. In some embodiments, each of the modules 202 – 208 may be implemented as dedicated hardware circuit comprising custom application-specific integrated circuit (ASIC) or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components. Each of the modules 202 – 208 may also be implemented in a programmable hardware device such as a field programmable gate array (FPGA), programmable array logic, programmable logic device, and so forth. Alternatively, each of the modules 202 – 208 may be implemented in software for execution by various types of processors (e.g., processor 104). An identified module of executable code may, for instance, include one or more physical or logical blocks of computer instructions, which may, for instance, be organized as an object, procedure, function, or other construct. Nevertheless, the executables of an identified module or component need not be physically located together, but may include disparate instructions stored in different locations which, when joined logically together, include the module and achieve the stated purpose of the module. Indeed, a module of executable code could be a single instruction, or many instructions, and may even be distributed over several different code segments, among different applications, and across several memory devices.
[034] As will be appreciated by one skilled in the art, a variety of processes may be employed for transmitting the video data associated with the vehicle operator. For example, the exemplary system 100 may assist transmission of the video data by the processes discussed herein. In particular, as will be appreciated by those of ordinary skill in the art, control logic and/or automated routines for performing the techniques and steps described herein may be implemented by the system 100 either by hardware, software, or combinations of hardware and software. For example, suitable code may be accessed and executed by the one or more processors on the system 100 to perform some or all of the techniques described herein. Similarly, application specific integrated circuits (ASICs) configured to perform some, or all of the processes described herein may be included in the one or more processors on the system 100.
[035] Referring now to FIG. 3, a flow diagram of a method 300 of transmitting a video data associated with a vehicle operator of a vehicle is illustrated via a flowchart, in accordance with some embodiments of the present disclosure. FIG. 3 is explained in conjunction with FIGS. 1 and 2. Each step of the method 300 may be implemented by the processor 104 of the DMS 102.
[036] In order to transmit the video data, at step 302, an activity of the vehicle operator may be monitored in real-time. In an embodiment, the activity of the vehicle operator may be monitored in an active state of the vehicle. The vehicle may correspond to a private vehicle (e.g., a car) or a commercial vehicle (e.g., a taxi, a bus, a van, a truck, etc.). With reference to FIG. 1, the activity of the vehicle operator may be monitored via the camera 108. In an embodiment, the camera 108 may be wirelessly connected to the DMS 102 externally through wireless network interfaces such as Bluetooth®, infrared, or any other wireless radio communication known in the art. In another embodiment, the camera 108 may be in-built in the DMS 102. In yet another embodiment, the camera 108 may be connected to a communication pathway of one or more components of the DMS 102 for capturing and monitoring the activity of the vehicle operator. Examples of the camera 108 may include, but are not limited to, a forward-facing camera, an in-cabin camera, a driver-facing camera, a dual-lens dash camera, and the like. Further, the active state of the vehicle corresponds to an ‘on’ condition, i.e., when an ignition of the vehicle is in the ‘on’ state.
[037] Further, at step 304, an anomaly may be detected in a state of the vehicle operator in response to monitoring the activity of the vehicle operator. The anomaly may be detected based on a set of parameters. In an embodiment, the set of parameters may include one or more vehicle parameters and one or more vehicle operator parameters. The one or more vehicle parameters, for example, may include an acceleration of the vehicle, sudden braking, a vehicle speed, and the like. Further, examples of the one or more vehicle operator parameters may include, but are not limited to, facial attributes, hand movements, head movements, posture and body language, and the like. Examples of the anomaly may include, but are not limited to, drowsiness, distraction, fatigue, impairment, and the like.
[038] Further, upon detecting the anomaly as mentioned via step 306, at step 308, an anomaly detection alert may be sent to the TCU 112. The anomaly detection alert may be sent to the TCU 112 over the CAN. Apart from the CAN, the anomaly detection alert may be sent to the TCU 112 via any other existing vehicle communication bus or wireless protocol, including but not limited to, a LIN (Local Interconnect Network), a FlexRay, a MOST (Media Oriented Systems Transport), and the like. In some embodiments, the anomaly detection alert may also be sent to the vehicle operator. This is done to alert the vehicle operator to take appropriate action (e.g., stopping the vehicle, calling an emergency contact, etc.).
[039] In addition to sending the anomaly detection alert, at step 310, the video data corresponding to the anomaly may be captured. The video data may be captured for a predefined time interval (e.g., 30 seconds). With reference to FIG. 1, the video data may be captured via the camera 108. Once the video data is captured, the video data may be stored within a local database in response to capturing the video data. The local database may correspond to a temporary database. This local database may reside within the memory 106 of the DMS 102.
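The following sketch illustrates, under stated assumptions, how the video data could be captured for the predefined time interval and written to local storage using OpenCV; the camera index, codec, and file path are assumptions and not part of the disclosure.

```python
# Illustrative sketch using OpenCV; camera index, codec, and path are assumptions.
import time
import cv2

def capture_clip(path: str = "/tmp/dms_clip.mp4", duration_s: int = 30) -> str:
    """Record the in-cabin camera for the predefined time interval and store
    the clip locally (the local database of the DMS)."""
    cam = cv2.VideoCapture(0)
    fps = cam.get(cv2.CAP_PROP_FPS) or 25.0
    width = int(cam.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cam.get(cv2.CAP_PROP_FRAME_HEIGHT))
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))
    end = time.time() + duration_s
    while time.time() < end:
        ok, frame = cam.read()
        if not ok:
            break                      # stop early if the camera feed drops
        writer.write(frame)
    cam.release()
    writer.release()
    return path
```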
[040] Once the video data is captured and stored, at step 312, the data transfer request may be sent to the TCU 112. The data transfer request may be sent to initiate transfer of the video data. In an embodiment, the data transfer request may be sent periodically at the pre-defined time interval (e.g., after every 5 minutes). It should be noted that, the data transfer request may be sent to the TCU 112 over the CAN. In one embodiment, in response to sending the data transfer request, at step 314, the acceptance response may be received from the TCU 112 based on one or more network parameters. Examples of the one or more network parameters may include, but are not limited to, signal strength, noise level, data transmission rate, and the like.
[041] Further, upon receiving the acceptance response, at step 316, the video data may be transmitted to the TCU 112. The video data may be transmitted to the TCU 112 via the wireless communication network (i.e., the network 110). In another embodiment, in response to sending the data transfer request, a rejection response may be received from the TCU 112. The rejection response may be received based on the one or more network parameters. In case of receiving the rejection response, the data transfer request may be resent to the TCU 112 at the pre-defined time interval. In other words, in case the rejection response is received from the TCU 112, the data transfer request may be resent to the TCU 112 after 5 minutes from the time of the previous data transfer request. As would be appreciated, the data transfer request may be iteratively sent after every 5 minutes until the acceptance response is received from the TCU 112.
[042] Once the TCU 112 receives the video data, the TCU 112 may upload the video data to a cloud server (i.e., the cloud server 118). The TCU 112 may upload the video data via a communication API. The cloud server 118 may be configured to store the video data in the database 120 for future reference. Further, once the video data is uploaded to the cloud server 118, the video data may be deleted from the local database. The cloud server 118 may correspond to an on-premises server or a cloud platform, e.g., a Tata Motors Limited Connected Vehicle Platform (TMLCVP), a Google Cloud Platform, or Amazon Web Services. Examples of the communication API may include, but are not limited to, Twilio, SendGrid, Google Cloud Messaging (GCM), and WebSocket. A method of uploading the video data to the cloud server is further explained in detail in conjunction with FIGS. 4A and 4B and FIGS. 5A – 5C. Further, an end-user (e.g., a vehicle owner, a family member, a friend, or the vehicle operator, etc.) may be able to request the cloud server 118 to display the video data via an application integrated within the user device 122 (e.g., a smartphone) based on their requirement. This complete method is further explained in detail in conjunction with FIGS. 4A – 5C.
[043] Referring now to FIGS. 4A and 4B, a flow diagram of a detailed exemplary process 400 of transmitting the video data associated with the vehicle operator of the vehicle is illustrated via a flowchart, in accordance with some embodiments of the present disclosure. FIGS. 4A and 4B are explained in conjunction with FIGS. 1, 2, and 3. In an embodiment, the exemplary process 400 may include a plurality of steps that may be performed by the processor 104 and processor 114 to perform transmission of the video data.
[044] Initially, at step 402, the activity of the vehicle operator may be monitored in real-time during the active state of the vehicle. The monitoring of the activity may be done using the camera 108 of the DMS 102. In an embodiment, the vehicle, for example, may correspond to a private vehicle (e.g., a car) or a commercial vehicle (e.g., a taxi, a bus, a van, a truck, etc.). Further, the active state of the vehicle corresponds to an ‘on’ condition, i.e., when an ignition of the vehicle is in the ‘on’ state. Further, based on the monitoring, at step 404, the anomaly may be detected in the state of the vehicle operator. In other words, based on monitoring the activity of the vehicle operator via the camera 108, the DMS 102 may detect the anomaly in the state of the vehicle operator. In an embodiment, the anomaly may be detected based on the set of parameters. Examples of the anomaly may include, but are not limited to, drowsiness, distraction, fatigue, impairment, and the like.
[045] In an embodiment, the set of parameters may include one or more vehicle parameters and one or more vehicle operator parameters. The one or more vehicle parameters, for example, may include an acceleration of the vehicle, sudden braking, a vehicle speed, etc. Further, examples of the one or more vehicle operator parameters may include, but are not limited to, facial attributes, hand movements, head movements, posture and body language, and the like. For example, in case the vehicle operator is feeling drowsy, the anomaly, i.e., the drowsiness, may be detected based on a vehicle parameter, e.g., a frequent or exaggerated steering correction, a speed of the vehicle above a pre-defined threshold limit, etc., and a facial attribute, e.g., eyelid drooping, prolonged blinking, head movements, etc.
[046] Further, at step 406, a check may be performed to determine whether the anomaly is detected or not. In one embodiment, based on the check performed, when no anomaly is detected, then the step 402 may be executed. In other words, when no anomaly is detected, the camera 108 may continue monitoring the activity of the vehicle operator. In another embodiment, based on the check performed, when the anomaly is detected, then at step 408, the anomaly detection alert may be sent to the TCU 112 over the CAN. In some embodiments, the anomaly detection alert may be sent to the vehicle operator to alert the vehicle operator to take appropriate actions (e.g., stop the vehicle, call an emergency contact, etc.) or precautionary measures.
[047] In addition to sending the anomaly detection alert, at step 410, the video data corresponding to the anomaly may be captured for the predefined time interval. For example, the pre-defined time interval may be set to 30 seconds. Further, the captured video data may be stored within the local database. The local database may reside within the memory 106 of the DMS 102. Once the video data is captured and stored, at step 412, the data transfer request may be sent to the TCU 112 to initiate transfer of the video data. In an embodiment, the data transfer request may be sent to the TCU 112 via the CAN. Further, upon sending the data transfer request, at step 414, a check may be performed to determine whether the acceptance response is received from the TCU 112 in response to sending the data transfer request.
[048] In one embodiment, based on the check performed, if the acceptance response is determined to be received, then step 416 may be executed. In another embodiment, when the rejection response is received instead of the acceptance response, then step 412 may be re-executed. In other words, if the rejection response is received, the data transfer request may be resent after an expiry of the pre-defined time interval (i.e., after the expiry of 5 minutes). As will be appreciated, the step 412 may be iteratively executed until the acceptance response is received. It should be noted that the acceptance response or the rejection response may be received from the TCU 112 based on the one or more network parameters. Examples of the one or more network parameters may include, but are not limited to, signal strength, noise level, data transmission rate, and the like.
[049] Further, upon receiving the acceptance response, at step 416, the video data may be transmitted to the TCU 112 via the wireless communication network. In other words, the DMS 102 may transmit the video data to the TCU 112 over the wireless communication network, i.e., the network 110. Once the TCU 112 receives the video data from the DMS 102, the TCU 112 may upload the video data to the cloud server 118. The TCU 112 may upload the video data to the cloud server 118 via the communication API. In an embodiment, the cloud server 118 may correspond to an on-premises server or a cloud platform, e.g., a Tata Motors Limited Connected Vehicle Platform (TMLCVP), a Google Cloud Platform, or Amazon Web Services. Examples of the communication API may include, but are not limited to, Twilio, SendGrid, Google Cloud Messaging (GCM), and WebSocket. As will be appreciated, the wireless communication network may be periodically enabled and disabled to save power consumption of the DMS 102 and the TCU 112.
[050] In order to upload the video data to the cloud server 118, at step 418, a data uploading request may be sent to the cloud server 118 to initiate uploading of the video data. In an embodiment, the data uploading request may be sent based on the one or more network parameters. In response to sending the data uploading request, at step 420, a check may be performed to identify whether a data uploading response is received or not. In other words, the TCU 112 may perform the check to determine whether the data uploading response is received from the cloud server 118 in response to sending the data uploading request. In an embodiment, the data uploading response may include a Uniform Resource Locator (URL) link of a destination address (e.g., a space within the database 120) where the video data needs to be stored. Further, based on the check performed at step 420, in one embodiment, if the data uploading response is received, then step 422 may be executed. In another embodiment, based on the check performed at step 420, if the data uploading response is not received from the cloud server 118, then the TCU 112 may wait until the data uploading response is received.
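For illustration, the sketch below shows one way the TCU 112 might send the data uploading request and then transmit the video data to the destination URL returned in the data uploading response, using the Python requests library; the endpoint path, JSON field names, and the use of an HTTP PUT to the returned URL are assumptions and not part of the disclosure.

```python
# Illustrative sketch using 'requests'; endpoint, field names, and HTTP PUT
# to the returned destination URL are assumptions.
import requests

def upload_video(api_base: str, video_path: str, vehicle_id: str) -> bool:
    """Ask the cloud server for an upload destination, then transmit the clip."""
    # Data uploading request; the response is assumed to carry the destination URL.
    resp = requests.post(f"{api_base}/upload-requests",
                         json={"vehicle_id": vehicle_id}, timeout=10)
    resp.raise_for_status()
    destination_url = resp.json()["upload_url"]

    # Transmit the video data to the destination returned by the cloud server.
    with open(video_path, "rb") as clip:
        put = requests.put(destination_url, data=clip, timeout=60)
    return put.status_code in (200, 201)
```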
[051] At step 422, the video data may be transmitted to the cloud server 118 via the communication API. Further, at step 424, a check may be performed to determine whether the transmission of the video data is successful, i.e., whether the video data upload is successful. In one embodiment, based on the check performed at step 424, if the transmission of the video data is successful, then step 426 may be executed. In another embodiment, based on the check performed at step 424, if the transmission of the video data is unsuccessful, then the TCU 112 may wait until the video data is successfully transmitted to the cloud server 118.
[052] Further, at step 426, a notification including a message of completion of uploading the video data may be sent to the cloud server 118. In other words, once the TCU 112 has successfully sent the video data to the cloud server 118, the TCU 112 may send the notification including the message of the successful transmission to the cloud server 118. In response to sending the notification, at step 428, an acknowledgement including one of a successful upload status or an unsuccessful upload status may be received by the TCU 112 from the cloud server 118. In one embodiment, when the acknowledgement includes the successful upload status, then at step 430, the acknowledgement including the successful upload status may be transmitted to the DMS 102. Further, upon receiving the acknowledgement of the successful upload status, at step 432, the video data may be deleted from the local database. In particular, the DMS 102 may delete the video data from the local database upon receiving the acknowledgement of the successful upload status from the TCU 112. Thereafter, the process 400 of transmission of the video data may end at step 434.
[053] In another embodiment, when the acknowledgement includes the unsuccessful upload status as mentioned via step 436, at step 438, the acknowledgement may be transmitted to the DMS 102. Further, upon receiving the acknowledgement of the unsuccessful upload status, at step 440, the video data may be retained within the local database. In particular, the DMS 102 may retain the video data within the local database. In addition, upon receiving the acknowledgement including the unsuccessful upload status, at step 442, the data uploading request may be resent to the cloud server 118 based on the one or more network parameters. Further, after resending the data uploading request, steps 420 to 440 may be re-executed.
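A minimal sketch of the DMS-side handling of this acknowledgement is given below; the acknowledgement dictionary format and the file-based representation of the local database are assumptions introduced only for illustration.

```python
# Illustrative sketch; the acknowledgement format and file-based local
# database are assumptions.
import os

def handle_upload_ack(ack: dict, local_clip_path: str) -> bool:
    """Delete the local clip on a successful upload status, retain it otherwise.
    Returns True when the clip was deleted, False when it was retained so that
    the data uploading request can be resent later."""
    if ack.get("status") == "upload_successful":
        if os.path.exists(local_clip_path):
            os.remove(local_clip_path)   # free local storage once the cloud copy exists
        return True
    # Unsuccessful upload: keep the clip in the local database for a later retry.
    return False
```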
[054] Referring now to FIGS. 5A, 5B, and 5C, an exemplary flow diagram 500 depicting a flow of communication between the DMS 102 and the TCU 112 is illustrated, in accordance with some embodiments of the present disclosure. FIGS. 5A, 5B, and 5C are explained in conjunction with FIGS. 1 – 4B.
[055] At step 502, the DMS 102 may start monitoring the activity of the vehicle operator in real-time upon detecting the active state of the vehicle. The active state of the vehicle may correspond to an ‘on’ condition, i.e., when an ignition of the vehicle is in ‘on’ state. The DMS 102 may monitor the activity via the camera 108. Further, based on the monitoring, at step 504, the DMS 102 may check if any anomaly is detected in the state of the vehicle operator. In one embodiment, if no anomaly is detected, then the DMS 102 may keep monitoring the activity of the vehicle operator via the camera 108. In another embodiment, if the anomaly is detected in the state of the vehicle operator, then at step 506, an alert, i.e., the anomaly detection alert may be sent to the TCU 112 over the CAN as depicted via a single head solid arrow line. Further, at step 508, the DMS 102 may start capturing the video data corresponding to the anomaly for 30 seconds, i.e., the pre-defined time interval. Further, the DMS 102 may store the video data in the local database.
[056] Further, at step 510, the DMS 102 may send the data transfer request to the TCU 112 to initiate the transfer of the video data. Upon receiving the data transfer request, at step 512, the TCU 112 may check the network condition based on the one or more network parameters. In one embodiment, based on the check performed for the network condition (e.g., a check for the signal strength), if the network condition is good, then at step 514, the DMS 102 may receive the acceptance response from the TCU 112. In another embodiment, based on the check performed for the network condition, if the network condition is bad, then at step 514', the DMS 102 may receive the rejection response from the TCU 112.
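The sketch below illustrates, under assumed thresholds, how the TCU-side network-condition check at step 512 might decide between the acceptance response and the rejection response; the signal-strength and data-rate thresholds are illustrative assumptions rather than values taken from the disclosure.

```python
# Illustrative sketch; the thresholds are assumptions chosen only for illustration.
def evaluate_transfer_request(signal_strength_dbm: float, data_rate_mbps: float) -> str:
    """Return 'accept' when the network condition is good enough to receive
    the video data from the DMS, otherwise 'reject'."""
    if signal_strength_dbm >= -95 and data_rate_mbps >= 1.0:
        return "accept"
    return "reject"
```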
[057] Upon receiving the acceptance response, at step 516, the DMS 102 may send the video data to the TCU 112 via the wireless communication network. Upon receiving the video data, at step 518, the TCU 112 may check the network condition based on the one or more network parameters to upload the video data to the cloud server 118. Further, based on the check performed, if the network condition is bad, i.e., the signal strength is weak, the TCU 112 may store the video data temporarily and wait for the network condition to be good, as mentioned via the step 520. Furthermore, based on the check performed, if the network condition is good, i.e., the signal strength is strong, then at step 522, the TCU 112 may send the data uploading request to the cloud server 118 to initiate the transmission of the video data to the cloud server 118. In response to sending the data uploading request, at step 524, the TCU 112 may receive the data uploading response from the cloud server 118 based on the network conditions determined considering the one or more network parameters.
[058] Upon receiving the data uploading response, at step 526, the TCU 112 may transmit the video data to the cloud server 118. Further, the cloud server 118 may store the video data within the database 120. Further, upon the successful transmission of the video data as mentioned via the step 528, at step 530, the TCU 112 may send, to the cloud server 118, the notification including the message of completion of uploading the video data at the cloud server 118. In other words, once the TCU 112 has successfully uploaded the video data to the cloud server 118, the TCU 112 may send the notification including the message of the successful transmission to the cloud server 118.
[059] In response to sending the notification, at step 532, the TCU 112 may receive the acknowledgement including one of the successful upload status or the unsuccessful upload status from the cloud server 118. In one embodiment, when the acknowledgement includes the successful upload status, then at step 534, the acknowledgement including the successful upload status, i.e., a video upload complete, may be transmitted to the DMS 102. Further, upon receiving the acknowledgement of the successful upload status, the DMS 102 may delete the video data from the local database. In another embodiment, when the acknowledgement includes the unsuccessful upload status, then the TCU 112 may transmit the acknowledgement including the unsuccessful upload status to the DMS 102. Further, upon receiving the acknowledgement of the unsuccessful upload status, the DMS 102 may retain the video data within the local database. In addition, upon receiving the acknowledgement including the unsuccessful upload status, the TCU 112 may resend the data uploading request to the cloud server 118 based on analysis of the one or more network parameters until the acknowledgment including the successful upload status is received by the TCU 112. Further, at step 536, the end-user (e.g., the vehicle owner, the family member, the friend, or the vehicle operator, etc.) may send a request to the cloud server 118 to display the video data via the application integrated within the user device 122 (e.g., the smartphone) based on their requirement.
[060] As will be also appreciated, the above-described techniques may take the form of computer or controller implemented processes and apparatuses for practicing those processes. The disclosure can also be embodied in the form of computer program code containing instructions embodied in tangible media, such as floppy diskettes, solid state drives, CD-ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer or controller, the computer becomes an apparatus for practicing the invention. The disclosure may also be embodied in the form of computer program code or signal, for example, whether stored in a storage medium, loaded into and/or executed by a computer or controller, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes an apparatus for practicing the invention. When implemented on a general-purpose microprocessor, the computer program code segments configure the microprocessor to create specific logic circuits.
[061] Various embodiments provide a method and a system for transmitting a video data associated with a vehicle operator of a vehicle. The disclosed method and system may monitor an activity of the vehicle operator in real-time during an active state of the vehicle. Further, the disclosed method and system may detect an anomaly in a state of the vehicle operator based on a set of parameters in response to monitoring the activity. Upon detecting the anomaly, the disclosed method and system may send an anomaly detection alert to a TCU over a CAN. In addition to sending the anomaly detection alert, the disclosed method and system may capture the video data corresponding to the anomaly for a predefined time interval. Further, the disclosed method and system may send a data transfer request to the TCU via the CAN to initiate transfer of the video data. The data transfer request may be sent periodically at a pre-defined time interval. Thereafter, the disclosed method and system may receive an acceptance response from the TCU based on one or more network parameters in response to sending the data transfer request. Lastly, the disclosed method and system may transmit the video data to the TCU via a wireless communication network upon receiving the acceptance response.
[062] Thus, the disclosed method and system try to overcome the technical problem of transmitting a video data associated with a vehicle operator of a vehicle. The disclosed method and system may provide a secure and stable means of communication and transmission of video data. Further, the disclosed method and system may enhance overall vehicle safety by providing real-time alerts to vehicle operators upon detecting any distraction or drowsiness, thereby reducing the risk of accidents caused by negligence of the vehicle operators. Furthermore, the disclosed method and system may send the real-time alerts and notifications to emergency contacts of a vehicle operator (e.g., a vehicle owner, a family member, a friend, and the like) regarding the status of a vehicle and the behaviour of the vehicle operator remotely. This may enable the emergency contacts to take action promptly if necessary or in case of any emergency. In addition, the disclosed method and system may facilitate historical event tracking. This may enable the vehicle operators or vehicle owners to gain valuable insights into driving behaviour trends and patterns, or any incident that happened in the past.
[063] In light of the above-mentioned advantages and the technical advancements provided by the disclosed method and system, the claimed steps as discussed above are not routine, conventional, or well understood in the art, as the claimed steps provide solutions to the existing problems in conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the device itself as the claimed steps provide a technical solution to a technical problem.
[064] The specification has described a method and a system for transmitting a video data associated with a vehicle operator of a vehicle. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments.
[065] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
[066] It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.
CLAIMS
1. A method of transmitting a video data associated with a vehicle operator of a vehicle, the method comprising:
monitoring (302), via a camera (108) of a Driver Monitoring System (DMS) (102), an activity of the vehicle operator in real-time during an active state of the vehicle;
detecting (304), by the DMS (102), an anomaly in a state of the vehicle operator based on a set of parameters in response to monitoring the activity;
upon detecting the anomaly (306),
sending (308), by the DMS (102), an anomaly detection alert to a Telematic Control Unit (TCU) (112) over a Controller Area Network (CAN); and
capturing (310), via the camera (108) of the DMS (102), the video data corresponding to the anomaly for a predefined time interval;
sending (312), by the DMS (102), a data transfer request to the TCU (112) via the CAN to initiate transfer of the video data, wherein the data transfer request is sent periodically at a pre-defined time interval;
receiving (314), by the DMS (102), an acceptance response from the TCU (112) based on one or more network parameters in response to sending the data transfer request; and
transmitting (316), by the DMS (102), the video data to the TCU (112) via a wireless communication network upon receiving the acceptance response.
2. The method as claimed in claim 1, further comprising:
storing, by the DMS (102), the video data in a local database of the DMS (102) in response to capturing the video data.
3. The method as claimed in claim 1, wherein the set of parameters comprises one or more vehicle parameters and one or more vehicle operator parameters.
4. The method as claimed in claim 1, further comprising:
receiving, by the DMS (102), a rejection response from the TCU (112) based on the one or more network parameters in response to sending the data transfer request; and
resending, by the DMS (102), the data transfer request to the TCU (112) at the pre-defined time interval.
5. The method as claimed in claim 1, further comprising:
uploading, by the TCU (112), the video data to a cloud server (118) via a communication Application Programming Interface (API).
6. The method as claimed in claim 5, wherein uploading the video data to the cloud server (118) comprises:
sending, by the TCU (112), a data uploading request to the cloud server (118) to initiate uploading of the video data based on the one or more network parameters;
receiving, by the TCU (112), a data uploading response from the cloud server (118) in response to sending the data uploading request; and
transmitting, by the TCU (112), the video data to the cloud server (118) upon receiving the data uploading response.
7. The method as claimed in claim 6, further comprising:
upon a successful transmission of the video data, sending, by the TCU (112) to the cloud server (118), a notification comprising a message of completion of uploading the video data at the cloud server; and
receiving, by the TCU (112) from the cloud server (118), an acknowledgement comprising one of a successful upload status or an unsuccessful upload status, in response to sending the notification.
8. The method as claimed in claim 7, further comprising:
upon receiving the acknowledgement comprising the successful upload status,
transmitting, by the TCU (112), the acknowledgement to the DMS (102); and
deleting, by the DMS (102), the video data from the local database upon receiving the acknowledgement.
9. The method as claimed in claim 7, further comprising:
upon receiving the acknowledgement comprising the unsuccessful upload status,
transmitting, by the TCU (112), the acknowledgement to the DMS (102); and
retaining, by the DMS (102), the video data within the local database.
10. The method as claimed in claim 7, further comprising:
upon receiving the acknowledgement comprising the unsuccessful upload status, resending, by the TCU (112), the data uploading request to the cloud server (118) to initiate uploading of the video data based on the one or more network parameters.
11. A system of transmitting a video data associated with a vehicle operator of a vehicle, the system comprising:
a processor (104);
a memory (106) storing instructions that, when executed by the processor (104), cause the processor (104) to perform operations comprising:
monitor (302), via a camera (108), an activity of the vehicle operator in real-time during an active state of the vehicle;
detect (304) an anomaly in a state of the vehicle operator based on a set of parameters in response to monitoring the activity;
upon detecting the anomaly (306),
send (308) an anomaly detection alert to a Telematic Control Unit (TCU) (112) over a Controller Area Network (CAN); and
capture (310), via the camera (108), the video data corresponding to the anomaly for a predefined time interval;
send (312) a data transfer request to the TCU (112) via the CAN, to initiate transfer of the video data, wherein the data transfer request is sent periodically at a pre-defined time interval;
receive (314) an acceptance response from the TCU (112) based on one or more network parameters in response to sending the data transfer request; and
transmit (316) the video data to the TCU (112) via a wireless communication network upon receiving the acceptance response.