Abstract: The present invention relates to systems and methods which are capable of detecting any impulse sound event, classifying the impulse sound as a desired acoustic event, and estimating the direction and the range of the source of the impulse sound event. Microphone array units (102) receive acoustic signals from sources. Each microphone array unit (102) converts the acoustic signals to electrical signals. A processing unit (104) processes the electrical signals. A digitizer unit (106) converts the electrical signals to digital data. A detection unit (108) detects impulse sound. A classification unit (110) classifies the detected impulse sound. A direction estimation unit (112) estimates the direction of the classified sound. A range estimation unit (118) estimates geographical coordinates of the source of the classified impulse sound.
DESC:TECHNICAL FIELD
[0001] The present invention relates generally to detection and localization of acoustic events. More particularly, the present invention relates to a system and method for detecting and localizing an impulse sound event.
BACKGROUND
[0002] Many conventional technologies are available for detecting, identifying and locating the source of an acoustic event. One such technique is provided in US6847587B2, which discloses a system and method for detecting, identifying and locating the source of an acoustic event, particularly in a metropolitan area. Individual sensors, each consisting of a single microphone, a processor and a synchronized clock, are dispersed over an area; each sensor estimates the time of arrival of an impulse sound and sends it to a centralized station, where the data processing takes place. The disadvantage of the system and method provided in US6847587B2 is that a dispersed sensor unit is not capable of estimating the direction of the acoustic event on its own. The dispersed sensor unit sends the time of arrival of the impulse sound to the centralized station, where the location of the acoustic event is estimated. Since the localization calculations happen at a centralized place, the time-of-arrival information sent by all the units must be highly synchronized. Therefore, each unit must be synchronized very precisely, as synchronization errors affect the accuracy of the whole system.
[0003] Another conventional technique is provided in US5586086A, which discloses a method for estimating the direction and range of a firearm event. The method described in this prior art is capable of providing both the range and the direction of a firearm event only if the projectile from the firearm is supersonic and a shockwave is produced along with the muzzle blast. In the case of a subsonic projectile, the method described is capable of giving only the direction of the firearm event. The disadvantage of the technique provided in US5586086A is therefore that it provides both range and direction only for a supersonic projectile; for a subsonic projectile, the method cannot provide the range of the firearm event.
[0004] There is still a need for an invention which solves the above-defined problems and provides a system and method capable of detecting any impulse sound event, classifying the impulse sound as a desired acoustic event, and estimating the direction and range of the source of the impulse sound event.
SUMMARY
[0005] This summary is provided to introduce concepts related to an impulse sound detection and localization system and method thereof. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.
[0006] Various embodiments herein provide one or more systems and methods for detecting and localizing impulse sound. In one of the embodiments, an impulse sound detection and localization system includes a plurality of microphone array units, a plurality of processing units and a host unit. The microphone array units are configured to receive acoustic signals from a plurality of sources. Each microphone array unit is configured to convert the acoustic signals to electrical signals. Each processing unit is configured to process the electrical signals. The processing unit includes a digitizer unit, a detection unit, a classification unit, and a direction estimation unit. The digitizer unit is configured to convert the electrical signals to digital data. The detection unit is configured to detect impulse sound using the digital data. The classification unit is configured to classify the detected impulse sound. The direction estimation unit is configured to estimate the direction of the classified sound. The host unit includes a range estimation unit. The range estimation unit is configured to estimate geographical coordinates of the source of the classified impulse sound.
In another embodiment, a method for detecting and localizing impulse sound includes a step of receiving, by a plurality of microphone array units, acoustic signals from a plurality of sources. The method includes a step of converting, by a microphone array unit, the acoustic signals to electrical signals. The method includes a step of converting, by a digitizer unit, the electrical signals to digital data. The method includes a step of detecting, by a detection unit, impulse sound using the digital data. The method includes a step of classifying, by a classification unit, the detected impulse sound. The method includes a step of estimating, by a direction estimation unit, the direction of the classified sound. The method includes a step of estimating, by a range estimation unit, geographical coordinates of the classified impulse sound.
BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
[0007] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.
[0008] Figure 1 illustrates a block diagram depicting an impulse sound detection and localization system, according to an implementation of the present invention.
[0009] Figure 2 illustrates a schematic diagram depicting an impulse sound detection and localization system, according to an embodiment of the present invention.
[0010] Figure 3 illustrates a schematic diagram depicting a tetrahedron microphone array, according to an embodiment of the present invention.
[0011] Figure 4 illustrates a flow chart depicting a method for detecting and localizing impulse sound, according to an exemplary implementation of the present invention.
[0012] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present invention. Similarly, it will be appreciated that any flowcharts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0013] In the following description, for the purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.
[0014] The various embodiments of the present invention provide an impulse sound detection and localization system and method thereof. Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.
[0015] References in the present invention to “one embodiment” or “an embodiment” mean that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
[0016] In one of the embodiments, an impulse sound detection and localization system includes a plurality of microphone array units, a plurality of processing units, and a host unit. The microphone array units are configured to receive acoustic signals from a plurality of sources. Each microphone array unit is configured to convert the acoustic signals to electrical signals. Each processing unit is configured to process the electrical signals. Each processing unit includes a digitizer unit, a detection unit, a classification unit, and a direction estimation unit. The digitizer unit is configured to convert the electrical signals to digital data. The detection unit is configured to detect impulse sound using the digital data. The classification unit is configured to classify the detected impulse sound. The direction estimation unit is configured to estimate the direction of the classified sound. The host unit includes a range estimation unit. The range estimation unit is configured to estimate geographical coordinates of the classified impulse sound.
[0017] In another implementation, the host unit includes a display unit. The display unit is configured to display the estimated geographical coordinates.
[0018] In another implementation, impulse sound is classified as a desired acoustic event or a non-desired acoustic event.
[0019] In another implementation, the processing unit includes one or more sensors which are configured to sense data.
[0020] In another implementation, the direction estimation unit is configured to estimate direction of the classified sound based on the sensed data.
[0021] In another implementation, the processing unit includes a conditioning unit. The conditioning unit is configured to provide power to the microphone array unit and amplify the electrical signals.
[0022] In another implementation, the sensors include a magnetic compass, a temperature sensor, and a positioning sensor. The magnetic compass is configured to estimate direction with respect to at least one cardinal direction. The temperature sensor is configured to sense temperature around the microphone array unit. The positioning sensor is configured to determine position of the microphone array unit.
[0023] In another implementation, the detection unit is configured to detect the impulse sound by checking changes in the energy level of the acoustic signals.
[0024] In another implementation, the classification unit is configured to classify the impulse sound by using pre-determined sounds. The classification unit is configured to identify one or more features of the detected impulse sound and pre-determined sounds and compare the identified features of the impulse sound with the identified features of the pre-determined sounds.
[0025] In another implementation, the direction estimation unit is configured to estimate the direction of the classified impulse sound in azimuth and elevation angles. The direction estimation unit is further configured to estimate the time delay between signals arriving at a plurality of microphones within an array.
[0026] In another implementation, the system includes a pedestal unit. The pedestal unit is configured to orient towards the desired impulse sound event.
[0027] In another implementation, the system includes a wired communication link or a wireless communication link between the microphone array units, the processing units, and the host unit. In another implementation, the direction estimation unit is configured to determine a path difference between the received signals at microphones. The direction estimation unit is further configured to identify time delay between the received signals at microphones by correlating the received signals with each other and computing a value which corresponds to the identified time delay between the received signals. The direction estimation unit is further configured to estimate direction of the classified sound based on the computed time delay between the received signals at microphones.
[0028] In another embodiment, a method for detecting and localizing impulse sound includes a step of receiving, by a plurality of microphone array units, acoustic signals from a plurality of sources. The method includes a step of converting, by a microphone array unit, the acoustic signals to electrical signals. The method includes a step of converting, by a digitizer unit, the electrical signals to digital data. The method includes a step of detecting, by a detection unit, impulse sound using the digital data. The method includes a step of classifying, by a classification unit, the detected impulse sound. The method includes a step of estimating, by a direction estimation unit, the direction of the classified sound. The method includes a step of estimating, by a range estimation unit, geographical coordinates of the classified impulse sound.
[0029] In an embodiment of the present invention, a system and method for detecting and localizing an impulse sound event are provided, in which a plurality of subsystems deployed over an area estimate the geographical coordinates of the source of the event and display them on a map at a host unit.
[0030] Figure 1 illustrates a block diagram depicting an impulse sound detection and localization system (100), according to an implementation of the present invention.
[0031] An impulse sound detection and localization system (hereinafter referred to as “system”) (100) includes a plurality of microphone array units (102) and a plurality of processing units (104). In one embodiment, each microphone array unit (102) includes a respective processing unit (104). In an embodiment, the microphone array units (102) and processing units (104) are deployed to cover a large area and a plurality of microphones continuously monitor ambient acoustic signals from the area.
[0032] The plurality of microphone array units (102) are configured to receive acoustic signals from a plurality of sources in the area. Each microphone array unit (102) is configured to convert the received acoustic signals into electrical signals. In an embodiment, the sources can include any object in the area which generates acoustic signals.
[0033] The processing unit (104) is configured to cooperate with the microphone array unit (102) to receive the electrical signals. The processing unit (104) is further configured to process the received electrical signals. In an embodiment, the processing unit (104) includes a digitizer unit (106), a detection unit (108), a classification unit (110), and a direction estimation unit (112).
[0034] The digitizer unit (106) is configured to digitize the electrical signals, converting them into digital data. In an embodiment, the digitized samples are squared and averaged to estimate the energy of the acoustic signals, and the estimated energy levels are used to detect any impulse sounds.
[0035] The detection unit (108) is configured to cooperate with the digitizer unit (106) to receive the digital data. The detection unit (108) is further configured to detect impulse sound using the digital data. In an embodiment, the detection unit (108) is configured to detect the impulse sound by checking changes in the energy level of the acoustic signals. In an embodiment, the detection unit (108) looks for any sudden changes in the acoustic energy levels for detecting impulse sound.
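The energy-based detection described above can be sketched in a few lines. The following is a minimal illustration, not the claimed implementation; the frame length and threshold ratio are hypothetical tuning parameters:

```python
import numpy as np

def detect_impulse(samples, frame_len=256, threshold_ratio=4.0):
    """Flag frames whose short-term energy jumps well above the
    ambient background level (frame length and ratio are
    hypothetical tuning parameters)."""
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy = np.mean(frames ** 2, axis=1)   # squared and averaged samples
    background = np.median(energy)          # robust ambient estimate
    return np.flatnonzero(energy > threshold_ratio * background)

# Quiet noise with one loud burst: only the burst frame is flagged.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.01, 4096)
signal[2048:2080] += 0.8  # simulated impulse
print(detect_impulse(signal))  # → [8]
```

The median is used as the background estimate so that the impulse itself does not inflate the threshold; a running average over past frames would be an equally plausible choice.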
[0036] The classification unit (110) is configured to cooperate with the detection unit (108) to receive the detected impulse sound. The classification unit (110) is further configured to classify the detected impulse sound. In an embodiment, the classification unit (110) is configured to classify the impulse sound as a desired acoustic event or a non-desired acoustic event. In one embodiment, the classification unit (110) is configured to classify the impulse sound by using pre-determined sounds. The classification unit (110) is further configured to identify one or more features of the detected impulse sound and the pre-determined sounds, and to compare the identified features of the impulse sound with the identified features of the pre-determined sounds. In an embodiment, once an impulse sound is detected, the classification unit (110) classifies the detected impulse sound as a desired or non-desired impulse sound. The classification unit (110) is trained with pre-recorded acoustic events, and the features, i.e., time-domain as well as frequency-domain features, which are unique to the acoustic event of interest, are stored in the classification unit (110) before deployment of the system (100). The classification unit (110) extracts the same features from the detected impulse sounds and compares the extracted features with the stored features derived from the training set. The classification unit (110) declares the impulse sound a desired acoustic event when the stored features match the extracted features of the acoustic event.
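As a rough sketch of this feature-matching scheme (the specific features, the relative-tolerance rule, and all parameter values below are illustrative assumptions, not taken from the disclosure):

```python
import numpy as np

def features(x, rate=16000):
    """Two illustrative features: spectral centroid (frequency
    domain) and zero-crossing rate (time domain)."""
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1.0 / rate)
    centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0
    return np.array([centroid, zcr])

def classify(event, stored_features, tolerance=0.25):
    """Declare a desired event when the extracted features match
    the stored, training-derived features within a relative
    tolerance (a hypothetical matching rule)."""
    rel_error = np.abs(features(event) - stored_features) / np.abs(stored_features)
    return bool(np.all(rel_error < tolerance))

# "Train" on a decaying broadband burst, then test two candidates.
rng = np.random.default_rng(1)
template = rng.normal(size=1024) * np.exp(-np.arange(1024) / 100.0)
stored = features(template)
print(classify(template * 0.9, stored))  # True: same waveform shape
t = np.arange(1024) / 16000.0
print(classify(np.sin(2 * np.pi * 100 * t), stored))  # False: tonal, not impulsive
```

A deployed classifier would use a richer feature set and a trained decision rule; the point here is only the extract-compare-declare flow described in the text.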
[0037] The direction estimation unit (112) is configured to cooperate with the classification unit (110) to receive the classified sound. The direction estimation unit (112) is further configured to estimate the direction of the classified sound. In an embodiment, the direction estimation unit (112) is configured to estimate the direction of the classified impulse sound in azimuth and elevation angles. The direction estimation unit (112) is further configured to estimate the time delay between the signals arriving at a plurality of microphones within an array. The direction estimation unit (112) is configured to determine a path difference between the received acoustic signals, identify the time delay by correlating the received signals with each other, and compute a value which corresponds to the identified time delay between the received signals, for estimating the direction of the classified sound based on the computed value. In an embodiment, once the impulse sound is identified as the desired acoustic event, the direction estimation unit (112) estimates both azimuth and elevation angles of the impulse sound. The time delays between the signals arriving at the microphones in the array are unique to the direction in which the source is located. The direction estimation unit (112) first estimates the time delay between the signals received from the microphones, and this delay is used in further estimating the direction of the event. A pair of microphones is sufficient for estimating the direction in azimuth, but the pair cannot distinguish whether the sound is coming from the back side or the front side of the array: for a pair of microphones, the range coverage is only 0° to 180°. To remove this front-back ambiguity, a third microphone is placed to form a plane. To estimate the elevation angle, a fourth microphone is placed in a plane perpendicular to the plane formed by the first three microphones.
[0038] In an embodiment, the direction estimation unit (112) is configured to perform time-delay-based Direction of Arrival (DOA) estimation. First, the direction estimation unit (112) estimates the time delay between the impulse sound received at the microphones in the microphone array unit (102) by using a correlation method, and then the time delays are used to estimate the DOA of the desired impulse sound.
[0039] For a linear array with two sensors, the path difference p between the signals arriving at the two sensors from a direction θ is given as:

p = d·cos(θ)

where d is the distance between the sensors and θ is the direction of arrival of the signal, measured from the array axis.

The time delay is obtained as:

τ = d·cos(θ)/c

where c is the speed of sound in air.

The direction of the signal is obtained as:

θ = cos⁻¹(c·τ/d)

In the direction estimation, the time delay is estimated using the correlation method, wherein the signals from at least two sensors are correlated with each other, and the lag at which the correlation value is maximum corresponds to the time delay between the signals.
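The correlation-based time-delay estimation and the angle inversion can be sketched as follows; the sample rate, microphone spacing, and sign convention are illustrative assumptions, and the angle is measured from the array axis so it spans the 0° to 180° coverage discussed in the text:

```python
import numpy as np

C = 343.0  # assumed speed of sound in air (m/s, at roughly 20 degrees C)

def estimate_doa(x1, x2, d, rate):
    """Estimate direction of arrival for one two-microphone pair:
    cross-correlate the signals, take the lag of the correlation
    peak as the time delay tau, then invert tau = d*cos(theta)/c."""
    corr = np.correlate(x1, x2, mode="full")
    lag = np.argmax(corr) - (len(x2) - 1)  # peak lag in samples
    tau = lag / rate
    return np.degrees(np.arccos(np.clip(C * tau / d, -1.0, 1.0)))

# Synthetic check: the same impulse reaches mic 1 five samples later.
rate, d = 48000, 0.5  # hypothetical sample rate (Hz) and spacing (m)
x2 = np.zeros(256); x2[100] = 1.0
x1 = np.zeros(256); x1[105] = 1.0
print(round(estimate_doa(x1, x2, d, rate), 1))  # → 85.9
```

With ideal impulses the correlation peak sits exactly at the five-sample delay; with real noisy signals the peak is broader, and interpolation around it is a common refinement.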
[0040] In another exemplary embodiment, to counter the conventional problem of the linear array, the microphones are arranged in such a way that the accuracy of direction estimation remains the same irrespective of the direction of the actual sound source. With a triangular arrangement of three microphones, the sound source always falls in the high-accuracy zone of one of the three pairs of microphones. Once the direction of the acoustic event is estimated, it is sent along with the unit's own geographical coordinates to the host unit (116) through the communication link.
[0041] In an embodiment, the processing unit (104) includes a conditioning unit (114). The conditioning unit (114) is configured to provide power to the microphone array unit (102) and amplify the electrical signals. In one embodiment, the conditioning unit (114) is a signal conditioning unit.
[0042] In an embodiment, the system (100) includes a host unit (116). The host unit (116) is configured to cooperate with each processing unit (104). The host unit (116) further includes a range estimation unit (118) and a display unit (120).
[0043] The range estimation unit (118) is configured to estimate geographical coordinates of the classified impulse sound event.
[0044] The display unit (120) is configured to cooperate with the range estimation unit (118) to receive the estimated geographical coordinates. The display unit (120) is further configured to display the estimated geographical coordinates to a user.
[0045] In another embodiment, the system (100) includes one or more sensors (122). The sensors (122) are configured to sense data from a deployed area. In an embodiment, the sensors (122) include, but are not limited to, a magnetic compass, a temperature sensor, and a positioning sensor. The magnetic compass is configured to estimate direction with respect to at least one cardinal direction. The cardinal direction can be, but is not limited to, North. The temperature sensor is configured to sense the temperature around the microphone array unit (102). The temperature sensor improves the accuracy of the estimation of direction. The positioning sensor is configured to determine the position of the microphone array unit (102). In an embodiment, the positioning sensor can be a Global Positioning System (GPS) receiver. The positioning sensor records its own position in latitude and longitude. In one embodiment, the direction estimation unit (112) is configured to estimate the direction of the classified sound based on the sensed data. The sensed data includes, but is not limited to, data related to cardinal direction, temperature-related data, and position-related data.
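The benefit of the temperature sensor can be made concrete: time-delay-based direction estimates depend on the speed of sound, which varies with air temperature. The linear approximation below is a standard acoustics formula, not a formula taken from this disclosure:

```python
def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s) as a linear
    function of temperature in degrees Celsius."""
    return 331.3 + 0.606 * temp_c

# At 20 degrees C this recovers the familiar ~343 m/s figure.
print(round(speed_of_sound(20.0), 1))  # → 343.4
```

Feeding the sensed temperature into the DOA computation, instead of assuming a fixed 343 m/s, removes a bias of roughly 0.2% per degree Celsius from the estimated angles.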
[0046] In an embodiment, the system (100) includes a pedestal unit (124). The pedestal unit (124) is configured to orient towards the desired impulse sound event.
[0047] In an embodiment, the system (100) includes a wired communication link or a wireless communication link between the microphone array units (102), the processing units (104), and the host unit (116). In an embodiment, the various modules of the system (100) are communicatively coupled with each other by using a network (not shown in the figures). In an embodiment, the network includes wired and wireless networks having links. Examples of the wired networks include a Wide Area Network (WAN), a Local Area Network (LAN), a client-server network, a peer-to-peer network, and so forth. Examples of the wireless networks include Wi-Fi, a Global System for Mobile communications (GSM) network, a General Packet Radio Service (GPRS) network, an enhanced data GSM environment (EDGE) network, 802.5 communication networks, Code Division Multiple Access (CDMA) networks, and Bluetooth networks.
[0048] The host unit (116) receives the information from the multiple subsystems deployed over the area and estimates the geographical coordinates of the impulse sound. The location of the acoustic event is displayed on a map, and a command is given to a pedestal unit (124) to orient towards the direction of the acoustic event. The pedestal unit (124) can be used to mount a video camera and/or a night vision camera for further action.
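One plausible way for the host unit to estimate source coordinates from several (position, bearing) reports is a least-squares intersection of the bearing lines. This sketch works in a local east-north frame in metres and assumes compass bearings measured clockwise from North; it is an illustration of bearings-only triangulation, not the claimed method:

```python
import numpy as np

def triangulate(positions, bearings_deg):
    """Least-squares intersection of bearing lines reported by
    several subsystems (x east, y north, metres; bearings in
    degrees clockwise from North, as a compass would report)."""
    A, b = [], []
    for (x, y), brg in zip(positions, np.radians(bearings_deg)):
        # A bearing line through (x, y) has direction (sin b, cos b);
        # its normal is (cos b, -sin b), giving one linear equation.
        nx, ny = np.cos(brg), -np.sin(brg)
        A.append([nx, ny])
        b.append(nx * x + ny * y)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol

# Two units taking bearings on a source at (100, 100).
units = [(0.0, 0.0), (200.0, 0.0)]
brgs = [45.0, 315.0]
print(triangulate(units, brgs))  # ≈ [100. 100.]
```

With more than two reports the system is overdetermined and the least-squares solution averages out bearing noise; converting the local solution back to latitude and longitude is a separate, routine step.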
[0049] Figure 2 illustrates a schematic diagram (200) depicting an impulse sound detection and localization system (100), according to an embodiment of the present invention.
[0050] Figure 2 illustrates a scenario where the system (100) is deployed. In Figure 2, the microphone array units (102) are deployed over an area, and each is connected to a processing unit (104). A tetrahedron array of microphones, consisting of four microphones, is mounted on a tripod deployed in the area. In an embodiment, each microphone array unit (102) is connected with a respective processing unit (104). If acoustic signals are received from any event, for example an impulse sound (202), the microphone array unit (102) converts the acoustic signals into electrical signals. The processing unit (104) then sends the estimated direction along with the geographical coordinates of the array to a host unit (116), where the geographical coordinates of the acoustic event are estimated by the range estimation unit (118) of the host unit (116), and a display unit (120) of the host unit (116) then displays them on a map. The host unit (116) sends a command to the pedestal unit (124) to look towards the acoustic event. In an exemplary embodiment, the system (100) is configured to estimate the geographical coordinates of the one or more sources of the event and display them on a map in the host unit (116).
[0051] In an exemplary embodiment, the magnetic compass gives the orientation of a primary microphone pair with respect to North through serial communication to the direction estimation unit (112) (as shown in Figure 1), which enables the direction estimation unit (112) to estimate the direction with respect to North irrespective of the orientation of the primary microphone pair. In one exemplary embodiment, the temperature sensor senses the ambient temperature around the microphone array units (102) and transmits the sensed data to the direction estimation unit (112) through serial communication, which improves the accuracy of the estimation of direction. In another exemplary embodiment, the positioning sensor records the array position in latitude and longitude and transmits the position-related data to the host unit (116) through the communication link.
[0052] Figure 3 illustrates a schematic diagram (300) depicting a tetrahedron microphone array, according to an embodiment of the present invention.
[0053] In an exemplary embodiment, Figure 3 illustrates an array of microphones (302) arranged in a tetrahedron fashion. Four microphones are placed at the vertices of a tetrahedron, which forms six two-element linear arrays of microphones, as required for estimating the direction in both azimuth and elevation.
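The pair count can be checked directly: four vertices yield C(4,2) = 6 two-element sub-arrays. The tetrahedron coordinates below are one illustrative placement, scaled to unit microphone spacing (the disclosure does not fix coordinates):

```python
from itertools import combinations
import numpy as np

# Vertices of a regular tetrahedron, scaled so that every edge
# (i.e., every microphone spacing) has unit length.
mics = np.array([
    [1.0, 1.0, 1.0],
    [1.0, -1.0, -1.0],
    [-1.0, 1.0, -1.0],
    [-1.0, -1.0, 1.0],
]) / np.sqrt(8.0)

pairs = list(combinations(range(4), 2))
print(len(pairs))  # → 6 two-element sub-arrays
for i, j in pairs:
    # Each pair is an equally spaced two-element linear array.
    assert np.isclose(np.linalg.norm(mics[i] - mics[j]), 1.0)
```

Because the fourth microphone lies out of the plane of the other three, the six pairs span three independent directions, which is what allows both azimuth and elevation to be resolved.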
[0054] Figure 4 illustrates a flow chart (400) depicting a method for detecting and localizing impulse sound, according to an exemplary implementation of the present invention.
[0055] The flow chart starts at a step (402), receiving, by a plurality of microphone array units, acoustic signals from a plurality of sources. In an embodiment, a plurality of microphone array units (102) are configured to receive acoustic signals from a plurality of sources. At a step (404), converting, by a microphone array unit, the acoustic signals to electrical signals. In an embodiment, each microphone array unit (102) is configured to convert the acoustic signals to electrical signals. At a step (406), converting, by a digitizer unit, the electrical signals to digital data. In an embodiment, a digitizer unit (106) is configured to convert the electrical signals to digital data. At a step (408), detecting, by a detection unit, impulse sound using the digital data. In an embodiment, a detection unit (108) is configured to detect impulse sound using the digital data. At a step (410), classifying, by a classification unit, the detected impulse sound. In an embodiment, a classification unit (110) is configured to classify the detected impulse sound. At a step (412), estimating, by a direction estimation unit, direction of the classified sound. In an embodiment, a direction estimation unit (112) is configured to estimate direction of the classified sound. At a step (414), estimating, by a range estimation unit, geographical coordinates of said classified impulse sound. In an embodiment, a range estimation unit (118) is configured to estimate geographical coordinates of the classified impulse sound.
[0056] It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor(s) to furthering the art and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
CLAIMS:
1. An impulse sound detection and localization system (100), comprising:
a plurality of microphone array units (102) configured to receive acoustic signals from a plurality of sources, each microphone array unit (102) configured to convert said acoustic signals to electrical signals;
a plurality of processing units (104) configured to cooperate with said microphone array unit (102), each processing unit (104) configured to process said electrical signals, said processing unit (104) comprising:
a digitizer unit (106) configured to convert said electrical signals to digital data;
a detection unit (108) configured to cooperate with said digitizer unit (106), said detection unit (108) configured to detect impulse sound using said digital data;
a classification unit (110) configured to cooperate with said detection unit (108), said classification unit (110) configured to classify said detected impulse sound; and
a direction estimation unit (112) configured to cooperate with said classification unit (110), said direction estimation unit (112) configured to estimate direction of said classified sound; and
a host unit (116) configured to cooperate with said processing unit (104), said host unit (116) comprising:
a range estimation unit (118) configured to estimate geographical coordinates of said classified sound.
2. The system (100) as claimed in claim 1, wherein said host unit (116) comprises:
a display unit (120) configured to cooperate with said range estimation unit (118), said display unit (120) configured to display said estimated geographical coordinates.
3. The system (100) as claimed in claim 1, wherein said classified impulse sound is a desired acoustic event or a non-desired acoustic event.
4. The system (100) as claimed in claim 1, wherein said processing unit (104) comprises one or more sensors (122) configured to sense data.
5. The system (100) as claimed in claims 1 and 4, wherein said direction estimation unit (112) is configured to estimate direction of said classified sound based on said sensed data.
6. The system (100) as claimed in claim 1, wherein said processing unit (104) comprises a conditioning unit (114) configured to provide power to said microphone array unit (102) and amplify said electrical signals.
7. The system (100) as claimed in claim 4, wherein said sensors (122) comprise:
a magnetic compass configured to estimate direction with respect to at least one cardinal direction;
a temperature sensor configured to sense temperature around said microphone array unit (102); and
a positioning sensor configured to determine position of said microphone array unit (102).
8. The system (100) as claimed in claim 1, wherein said detection unit (108) is configured to detect said impulse sound by checking changes in the energy level of said acoustic signals.
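For illustration only (and not as a limitation on claim 8), the energy-change check performed by the detection unit (108) could be sketched as follows. The frame length, threshold ratio, and noise-floor estimator here are hypothetical choices, not values recited in the specification:

```python
import numpy as np

def detect_impulse(samples, frame_len=256, threshold_ratio=8.0):
    """Flag frames whose short-term energy rises well above the running
    background energy (illustrative sketch; parameter values hypothetical)."""
    n_frames = len(samples) // frame_len
    frames = samples[:n_frames * frame_len].reshape(n_frames, frame_len)
    energy = (frames ** 2).sum(axis=1)
    background = np.median(energy) + 1e-12  # robust estimate of the noise floor
    return np.flatnonzero(energy > threshold_ratio * background)

# Synthetic check: quiet noise with one impulsive event near sample 4000
fs = 8000
sig = 0.01 * np.random.default_rng(0).standard_normal(fs)
sig[4000:4064] += 1.0  # the impulse
hits = detect_impulse(sig)
```

The frame containing the impulse stands out because its energy exceeds the median background energy by far more than the chosen ratio; quieter frames do not trigger.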
9. The system (100) as claimed in claim 1, wherein said classification unit (110) is configured to classify said impulse sound by using pre-determined sounds, wherein said classification unit (110) is configured to:
identify one or more features of said detected impulse sound and pre-determined sounds; and
compare said identified features of said impulse sound with said identified features of said pre-determined sounds.
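As an illustrative sketch of the feature-identification and comparison steps of claim 9 (the feature set, class labels, and nearest-template rule are hypothetical, not claimed specifics):

```python
import numpy as np

def spectral_features(x, fs):
    """Two illustrative features: spectral centroid and energy-decay time."""
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    centroid = (freqs * spec).sum() / (spec.sum() + 1e-12)
    e = np.cumsum(x ** 2)
    decay = np.searchsorted(e, 0.9 * e[-1]) / fs  # time to reach 90 % of energy
    return np.array([centroid, decay])

def classify_impulse(x, fs, templates):
    """Assign the detected impulse to the nearest pre-determined template."""
    f = spectral_features(x, fs)
    return min(templates, key=lambda k: np.linalg.norm(f - templates[k]))

# Toy templates built from two synthetic impulses (hypothetical classes)
fs = 8000
t = np.arange(1024) / fs
sharp_impulse = np.exp(-40 * t) * np.sin(2 * np.pi * 500 * t)
slow_impulse = np.exp(-5 * t) * np.sin(2 * np.pi * 3000 * t)
templates = {"desired": spectral_features(sharp_impulse, fs),
             "non-desired": spectral_features(slow_impulse, fs)}
```

A detected impulse is classified as a desired or non-desired acoustic event (claim 3) according to which stored template its features lie closest to.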
10. The system (100) as claimed in claim 1, wherein said direction estimation unit (112) is configured to estimate the direction of said classified impulse sound in azimuth and elevation angles, wherein said direction estimation unit (112) is configured to estimate the time delay between signals received by a plurality of microphones within an array.
11. The system (100) as claimed in claims 1 and 3, comprising a pedestal unit (124) configured to orient towards said desired acoustic event.
12. The system (100) as claimed in claim 1, wherein said system (100) comprises a wired communication link or a wireless communication link between said microphone array units (102), said processing units (104), and said host unit (116).
13. The system (100) as claimed in claim 1, wherein said direction estimation unit (112) is configured to:
determine a path difference between said received signals;
identify time delay by correlating said received signals with each other and compute a value which corresponds to the identified time delay between said received signals; and
estimate direction of said classified sound based on said computed value.
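The correlation-based delay estimation and direction computation of claims 10 and 13 can be sketched for a single two-microphone pair as follows (a simplified planar sketch; the speed of sound, microphone spacing, and far-field assumption are illustrative, and in the claimed system the temperature sensor of claim 7 could refine the sound-speed value):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at ~20 °C; temperature-dependent in practice

def estimate_delay(sig_a, sig_b, fs):
    """Time delay via the cross-correlation peak; positive when sig_a
    arrives later than sig_b."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / fs

def azimuth_from_delay(delay, mic_spacing):
    """Path difference -> arrival angle for one microphone pair (far field)."""
    path_diff = SPEED_OF_SOUND * delay
    cos_theta = np.clip(path_diff / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))
```

A zero delay corresponds to a broadside source (90°); the maximum delay, spacing divided by the speed of sound, corresponds to an end-fire source (0°). Combining several pairs in the array yields both azimuth and elevation.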
14. A method for detecting and localizing impulse sound, said method comprising:
receiving, by a plurality of microphone array units (102), acoustic signals from a plurality of sources;
converting, by a microphone array unit (102), said acoustic signals to electrical signals;
converting, by a digitizer unit (106), said electrical signals to digital data;
detecting, by a detection unit (108), impulse sound using said digital data;
classifying, by a classification unit (110), said detected impulse sound;
estimating, by a direction estimation unit (112), direction of said classified sound; and
estimating, by a range estimation unit (118), geographical coordinates of said classified impulse sound.
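As a final illustrative sketch (not the claimed implementation), the range estimation step could triangulate the source from the bearings reported by two processing units at known positions, as determined by the positioning sensor of claim 7; the planar geometry and angle convention here are hypothetical simplifications:

```python
import numpy as np

def intersect_bearings(p1, az1, p2, az2):
    """Locate a source from two sensor positions and azimuths (degrees,
    counter-clockwise from the +x axis; simplified planar sketch)."""
    d1 = np.array([np.cos(np.radians(az1)), np.sin(np.radians(az1))])
    d2 = np.array([np.cos(np.radians(az2)), np.sin(np.radians(az2))])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the scalar ranges t1, t2
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# Two units 200 m apart both sighting the same source
source = intersect_bearings((0.0, 0.0), 45.0, (200.0, 0.0), 135.0)
```

Because each unit estimates its own bearing locally, the host unit (116) needs only the bearings and unit positions, avoiding the tight clock synchronization required by centralized time-of-arrival schemes.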
| # | Name | Date |
|---|---|---|
| 1 | 202041013299-PROVISIONAL SPECIFICATION [26-03-2020(online)].pdf | 2020-03-26 |
| 2 | 202041013299-FORM 1 [26-03-2020(online)].pdf | 2020-03-26 |
| 3 | 202041013299-DRAWINGS [26-03-2020(online)].pdf | 2020-03-26 |
| 4 | 202041013299-FORM-26 [21-06-2020(online)].pdf | 2020-06-21 |
| 5 | 202041013299-FORM-26 [25-06-2020(online)].pdf | 2020-06-25 |
| 6 | 202041013299-FORM 3 [29-06-2020(online)].pdf | 2020-06-29 |
| 7 | 202041013299-ENDORSEMENT BY INVENTORS [29-06-2020(online)].pdf | 2020-06-29 |
| 8 | 202041013299-DRAWING [29-06-2020(online)].pdf | 2020-06-29 |
| 9 | 202041013299-CORRESPONDENCE-OTHERS [29-06-2020(online)].pdf | 2020-06-29 |
| 10 | 202041013299-COMPLETE SPECIFICATION [29-06-2020(online)].pdf | 2020-06-29 |
| 11 | 202041013299-Proof of Right [19-09-2020(online)].pdf | 2020-09-19 |
| 12 | 202041013299-Correspondence, Form-1_28-09-2020.pdf | 2020-09-28 |
| 13 | 202041013299-FORM 18 [27-06-2022(online)].pdf | 2022-06-27 |
| 14 | 202041013299-FER.pdf | 2022-11-03 |
| 15 | 202041013299-CLAIMS [28-04-2023(online)].pdf | 2023-04-28 |
| 16 | 202041013299-COMPLETE SPECIFICATION [28-04-2023(online)].pdf | 2023-04-28 |
| 17 | 202041013299-DRAWING [28-04-2023(online)].pdf | 2023-04-28 |
| 18 | 202041013299-FER_SER_REPLY [28-04-2023(online)].pdf | 2023-04-28 |
| 19 | 202041013299-OTHERS [28-04-2023(online)].pdf | 2023-04-28 |
| 20 | 202041013299-IntimationOfGrant19-09-2024.pdf | 2024-09-19 |
| 21 | 202041013299-PatentCertificate19-09-2024.pdf | 2024-09-19 |
| 22 | 202041013299-PROOF OF ALTERATION [04-10-2024(online)].pdf | 2024-10-04 |
| 23 | 202041013299-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |
| 24 | 202041013299SEARCHSTRATERGYE_03-11-2022.pdf | |
| 25 | 202041013299AMENDEDSTRATERGYAE_15-12-2023.pdf | |