Abstract: A vehicle data relation device includes an internal audio/image data analyzer, configured to receive first data representing at least one of audio from within the vehicle or an image from within the vehicle; identify within the first data second data representing an audio indicator or an image indicator, wherein the audio indicator is human speech associated with a significance of an object external to the vehicle, and wherein the image indicator is an action of a human within the vehicle associated with a significance of an object external to the vehicle; an external image analyzer, configured to receive third data representing an image of a vicinity external to the vehicle; identify within the third data an object corresponding to at least one of the audio indicator or the image indicator; and an object data generator, configured to generate data corresponding to the object.
Claims:
1. A vehicle data relation device, comprising:
an internal audio/image data analyzer, configured to:
receive first data representing at least one of audio or an image from within the vehicle;
identify, within the first data, second data representing an audio indicator or an image indicator,
wherein the audio indicator is human speech, and wherein the image indicator represents an action of a human within the vehicle;
an external image analyzer, configured to:
identify, within third data representing an image of a vicinity external to the vehicle, an object corresponding to at least one of the audio indicator or the image indicator; and
an object data generator, configured to generate object data to classify the third data.
Description
RELATED APPLICATION
[0001] The present application claims priority to U.S. Non-Provisional Patent Application No. 17/211,930, filed March 25, 2021 and titled “VEHICLE DATA RELATION DEVICE AND METHODS THEREFOR”, the entire disclosure of which is hereby incorporated by reference.
Technical Field
[0002] Various aspects of the disclosure relate to speech recognition and speech-based object recognition from image data.
Background
[0003] Autonomous vehicles and partially autonomous vehicles typically rely on a plurality of sensors to detect information about the vehicles’ surroundings and make driving decisions based on this information. Such sensors may include, for example, a plurality of cameras, one or more Light Detection and Ranging (LIDAR) systems, one or more Radio Detection and Ranging (Radar) systems, microphones, accelerometers, and/or position sensors. As these sensors generate substantial quantities of data, autonomous vehicles may be required to parse through these large quantities of data for their driving operations.
[0004] One particular challenge in processing these data is the ability to discern between relevant sensor data and irrelevant sensor data. Artificial neural networks (ANNs) are increasingly used for processing sensor data and reaching driving decisions. Artificial neural networks may be particularly well-suited to this task, since they may be configured to receive and rapidly parse through large quantities of data.
[0005] Successful implementation of ANNs for such parsing of sensor data, however, requires substantial training. One particularly challenging task is to teach ANNs to distinguish between relevant sensor data and irrelevant sensor data. Otherwise stated, whereas human drivers may be able to distinguish with relative ease between relevant visual or auditory information, an ANN, without additional training, may be unable to do so.
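The general idea described above — using cues from occupants within the vehicle to mark which external sensor data is relevant, so that labeled training data can be produced — can be illustrated with a minimal sketch. All class, function, and parameter names below are hypothetical and chosen for illustration only; the sketch assumes an utterance keyword (e.g. from speech recognition) and a gaze bearing have already been extracted from in-cabin data, and that the external camera pipeline yields classified detections with a bearing relative to the vehicle.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    label: str          # classifier output for an external object, e.g. "truck"
    bearing_deg: float  # direction of the object relative to the vehicle heading

def label_relevant_detection(keyword: str,
                             gaze_bearing_deg: float,
                             detections: List[Detection],
                             tolerance_deg: float = 15.0) -> Optional[Detection]:
    """Return the external detection that matches both the spoken keyword and
    the occupant's gaze direction, i.e. the object the speech most plausibly
    refers to; None if no detection qualifies."""
    candidates = [d for d in detections
                  if d.label == keyword
                  and abs(d.bearing_deg - gaze_bearing_deg) <= tolerance_deg]
    # Among matching detections, prefer the one closest to the gaze ray.
    return min(candidates,
               key=lambda d: abs(d.bearing_deg - gaze_bearing_deg),
               default=None)

# Example: occupant says "truck" while gazing at a bearing of 8 degrees.
dets = [Detection("truck", -40.0), Detection("truck", 10.0), Detection("car", 12.0)]
print(label_relevant_detection("truck", 8.0, dets))
```

In this sketch the correlated detection (here, the truck near the gaze direction rather than the one far to the left) could then be stored as a labeled, relevance-marked sample for training an ANN to distinguish relevant from irrelevant sensor data.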
Brief Description of the Drawings
[0006] In the drawings, like reference characters generally refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the exemplary principles of the disclosure. In the following description, various exemplary aspects of the disclosure are described with reference to the following drawings, in which:
FIG. 1 shows an exemplary autonomous vehicle in accordance with various aspects of the present disclosure;
FIG. 2 shows various exemplary electronic components of a safety system of the vehicle in accordance with various aspects of the present disclosure;
FIG. 3 depicts an exemplary vehicle configured with a plurality of sensors;
FIG. 4 depicts a vehicle interior 400 according to an aspect of the disclosure;
FIG. 5 depicts an object labeling algorithm based on human speech;
FIG. 6 depicts an example of gaze being used to identify an object;
FIG. 7 depicts an eye gaze detector, according to an aspect of the disclosure;
FIG. 8 shows a calculation of mirror gaze according to an aspect of the disclosure;
FIG. 9 depicts a hand gesture detector, which may be configured to detect one or more hand gestures or hand positions;
FIG. 10 depicts a data synthesizer and labeler according to an aspect of the disclosure;
FIG. 11 depicts a data storage device, according to an aspect of the disclosure;
FIG. 12 depicts a vehicle data relation device, according to an aspect of the disclosure; and
FIG. 13 depicts a method of vehicle data relation.
Description
[0007] The following detailed description refers to the accompanying drawings that show, by way of illustration, exemplary details and aspects in which aspects of the present disclosure may be practiced.
[0008] The word "exemplary" is used herein to mean "serving as an example, instance, or illustration". Any aspect or design described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other aspects or designs.
[0009] Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures, unless otherwise noted.