Abstract: The present disclosure relates to a method and a system for determining number of targets from radar signals. The method includes receiving a set of radar signals reflected from one or more targets and generating a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval. The method may include transforming the range-time plot into a test image. The method can include comparing the test image with one or more labelled images in a catalogue. Each of the labelled images may be assigned any one class from a set of classes indicative of the number of targets therein. The method may include classifying the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
Description:
TECHNICAL FIELD
[0001] The present disclosure relates generally to radar signal processing. In particular, the present disclosure relates to a method and a system for determining number of targets from radar signals.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.
[0003] Radar technology has allowed for detection and tracking of targets in an area of interest. However, discerning multiple targets in the area of interest is a challenging task. Often, a range-time plot is used to determine the number of targets detected by the radar. This method generally involves comparing the range-time plot with an existing catalogue of range-time plot images, each labelled with the number of targets detected by the radar. Generally, a human operator is trained and tasked with manually or visually comparing the test images, and accordingly determining the number of targets therein.
[0004] However, such manual or visual determination may not be practical in many situations. For instance, the human operator may not be able to determine the number of targets detected by the radar in real time. Furthermore, manual or visual analysis is too time consuming for real-time applications and may be susceptible to human error. The accuracy of the analysis may also depend on how well trained the human operator is.
[0005] Automated solutions have been proposed to address the problem. However, existing solutions do not determine the number of detected targets using range-time plots. The form of these plots also changes with the nature of the radar hardware, waveform, and signal processing, as well as with the nature and movement of the target. While range-doppler images can be used to detect and track multiple targets, the number of targets that can be detected is limited by the maximum unambiguous range and doppler shift. Similarly, in marine radar (Plan Position Indicator), while a large number of targets can be detected, the number of targets to be detected needs to be known beforehand. Further, Frequency Modulated Continuous Wave (FMCW) radar detects multiple targets without any clear way to estimate the maximum number of targets to be detected. Multistatic radars, similarly, cannot predict the number of targets present at any specific instant. Orthogonal Frequency Division Multiplexing (OFDM) radars have similar limitations. There is, therefore, a need for a method and a system to address the above-mentioned problems.
OBJECTS OF THE INVENTION
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are listed herein below.
[0007] An object of the present disclosure is to provide a method and a system for determining number of targets from radar signals.
[0008] Another object of the present disclosure is to provide a method and a system for determining number of targets from radar signals in real time.
[0009] Another object of the present disclosure is to provide a method and a system for determining number of targets detected from range-time plots.
[0010] Yet another object of the present disclosure is to provide a method and a system for determining number of targets from radar signals with increased accuracy.
[0011] The other objects and advantages of the present disclosure will be apparent from the following description when read in conjunction with the accompanying drawings, which are incorporated for illustration of the preferred embodiments of the present disclosure and are not intended to limit the scope thereof.
SUMMARY
[0012] Aspects of the present disclosure relate generally to radar signal processing. In particular, the present disclosure relates to a method and a system for determining number of targets from radar signals.
[0013] In an aspect, a method for determining number of targets from radar signals may include receiving, by a radar device, a set of radar signals reflected from one or more targets and generating, by a processor, a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the radar device. The method may then include transforming, by the processor, the range-time plot into a test image of a predefined set of dimension values and comparing, by the processor, the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images. The method can also include classifying, by the processor, the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
[0014] In an embodiment, the test image has an NxN dimension, where N may be the predefined dimension value.
[0015] In an embodiment, comparing the test images may include using a pretrained neural network (NN), the pretrained NN being trained using batch normalization on a training dataset having the one or more labelled images for each of the classes in the set of classes.
[0016] In an embodiment, when the test image is classified into one class from the set of classes with a probability value less than a predetermined probability value threshold, the method may include: storing, by the processor, the test image in the catalogue of labelled images; assigning, by the processor, a label indicative of a new class to the stored image, and updating the set of classes with the new class; and retraining, by the processor, a pretrained NN such that the pretrained NN learns to classify the test image into any one class from the updated set of classes.
[0017] In an aspect, a system for determining number of targets from radar signals may include one or more radar devices that receive a set of radar signals reflected from one or more targets. The system may also include a processor, and a memory coupled to the processor, wherein the memory may include processor-executable instructions, which on execution, cause the processor to: generate a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the one or more radar devices; transform the range-time plot into a test image of a predefined set of dimension values; compare the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images; and classify the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
[0018] In an embodiment, the test image has an NxN dimension, where N may be the predefined dimension value.
[0019] In an embodiment, comparing the test images may include using a pretrained neural network (NN), the pretrained NN being trained using batch normalization on a training dataset having a plurality of labelled images for each of the classes in the set of classes.
[0020] In an embodiment, when the test image is classified into one class from the set of classes with a probability value less than a predetermined probability value threshold, the processor may be configured to: store the test image in the catalogue of labelled images; assign a label indicative of a new class to the stored image; update the set of classes with the new class; and retrain a pretrained NN such that the pretrained NN learns to classify the test image into any one class from the updated set of classes.
[0021] In an embodiment, a non-transitory computer-readable medium for determining number of targets from radar signals may include processor-executable instructions that cause a processor to perform the steps of the method disclosed herein.
[0022] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
BRIEF DESCRIPTION OF THE DRAWINGS
[0023] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0024] FIG. 1 illustrates an exemplary architecture representation for implementing a proposed system for determining number of targets from radar signals, according to embodiments of the present disclosure.
[0025] FIG. 2 illustrates an exemplary block diagram representation of the proposed system, according to embodiments of the present disclosure.
[0026] FIGs. 3A-F illustrate exemplary representations of labelled images in a training dataset, according to embodiments of the present disclosure.
[0027] FIGs. 4A-D illustrate flow charts depicting methods for determining number of targets from radar signals and training models thereof, according to embodiments of the present disclosure.
[0028] FIG. 5 illustrates a hardware platform for the implementation of the proposed system, according to embodiments of the present disclosure.
[0029] The foregoing shall be more apparent from the following more detailed description of the invention.
DETAILED DESCRIPTION
[0030] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0031] The ensuing description provides exemplary embodiments only, and is not intended to limit the scope, applicability, or configuration of the disclosure. Rather, the ensuing description of the exemplary embodiments will provide those skilled in the art with an enabling description for implementing an exemplary embodiment. It should be understood that, various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention as set forth.
[0032] Specific details are given in the following description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, circuits, systems, networks, processes, and other components may be shown as components in block diagram form in order not to obscure the embodiments in unnecessary detail. In other instances, well-known circuits, processes, algorithms, structures, and techniques may be shown without unnecessary detail in order to avoid obscuring the embodiments.
[0033] Also, it is noted that individual embodiments may be described as a process which is depicted as a flowchart, a flow diagram, a data flow diagram, a structure diagram, or a block diagram. Although a flowchart may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be re-arranged. A process is terminated when its operations are completed but could have additional steps not included in a figure. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, etc. When a process corresponds to a function, its termination can correspond to a return of the function to the calling function or the main function.
[0034] The word “exemplary” and/or “demonstrative” is used herein to mean serving as an example, instance, or illustration. For the avoidance of doubt, the subject matter disclosed herein is not limited by such examples. In addition, any aspect or design described herein as “exemplary” and/or “demonstrative” is not necessarily to be construed as preferred or advantageous over other aspects or designs, nor is it meant to preclude equivalent exemplary structures and techniques known to those of ordinary skill in the art. Furthermore, to the extent that the terms “includes,” “has,” “contains,” and other similar words are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising” as an open transition word without precluding any additional or other elements.
[0035] As used herein, “connect,” “configure,” “couple,” and its cognate terms, such as “connects,” “connected,” “configured,” and “coupled” may include a physical connection (such as a wired/wireless connection), a logical connection (such as through logical gates of semiconducting device), other suitable connections, or a combination of such connections, as may be obvious to a skilled person.
[0036] As used herein, “send,” “transfer,” “transmit,” and their cognate terms like “sending,” “sent,” “transferring,” “transmitting,” “transferred,” “transmitted,” etc. include sending or transporting data or information from one unit or component to another unit or component, wherein the content may or may not be modified before or after sending, transferring, transmitting.
[0037] Reference throughout this specification to “one embodiment” or “an embodiment” or “an instance” or “one instance” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
[0038] The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed products.
[0039] Various embodiments of the present disclosure relate generally to radar signal processing. In particular, the present disclosure relates to a method and a system for determining number of targets from radar signals.
[0040] In an aspect, a method for determining number of targets from radar signals may include: receiving, by a radar device, a set of radar signals reflected from one or more targets; generating, by a processor, a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the radar device; transforming, by the processor, the range-time plot into a test image of a predefined set of dimension values; comparing, by the processor, the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images; and classifying, by the processor, the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
[0041] In an aspect, a system for determining number of targets from radar signals may include one or more radar devices that receive a set of radar signals reflected from one or more targets. The system may also include a processor, and a memory coupled to the processor, wherein the memory may include processor-executable instructions, which on execution, cause the processor to: generate a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the one or more radar devices; transform the range-time plot into a test image of a predefined set of dimension values; compare the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images; and classify the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
[0042] In an aspect, a non-transitory computer-readable medium for determining number of targets from radar signals may include processor-executable instructions that cause a processor to perform the steps of the method disclosed herein.
[0043] FIG. 1 illustrates an exemplary architecture representation for implementing a proposed system for determining number of targets from radar signals, according to embodiments of the present disclosure. The architecture 100 may include a computing device 104, the system 110 and a radar device 140. The system 110 may be connected to the radar device 140 via a communication network 106.
[0044] In an embodiment, the system 110 may be implemented by way of a single device or a combination of multiple devices that may be operatively connected or networked together. For example, the system 110 may be implemented by way of a standalone device such as a centralized server, and the like, and may be communicatively coupled to a computing device 104. In another example, the system 110 may be implemented in/associated with the computing device 104.
[0045] In an embodiment, the electronic device 108 and/or the computing devices 104 may be at least one of an electrical, an electronic, an electromechanical, and a computing device. The electronic device 108 and/or the computing devices 104 may include, but are not limited to, a mobile device, a smart-phone, a Personal Digital Assistant (PDA), a tablet computer, a phablet computer, a wearable computing device, a Virtual Reality/Augmented Reality (VR/AR) device, a laptop, a desktop, a server, and the like.
[0046] In an embodiment, the system 110 may be implemented in hardware or a suitable combination of hardware and software. Further, the system 110 may include a processor 112, an Input/Output (I/O) interface 114, and a memory 116. The I/O interface 114 of the system 110 may be used to receive user inputs from the computing devices 104 associated with the users 102. Further, the system 110 may also include other units such as a display unit, an input unit, an output unit, and the like, however the same are not shown in FIG. 1, for the purpose of clarity. Also, in FIG. 1, only a few units are shown, however, the system 110 or the network architecture 100 may include multiple such units or the system 110/network architecture 100 may include any such numbers of the units, obvious to a person skilled in the art or as required to implement the features of the present disclosure.
[0047] In an embodiment, the system 110 may be a hardware device including the processor 112 executing machine-readable program instructions to determine number of targets from radar signals in a computing environment. Execution of the machine-readable program instructions by the processor 112 may enable the proposed system 110 to determine number of targets from radar signals. The “hardware” may comprise a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field programmable gate array, a digital signal processor, or other suitable hardware. The “software” may comprise one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code, or other suitable software structures operating in one or more software applications or on one or more processors. The processor 112 may include, for example, but is not limited to, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and any devices that manipulate data or signals based on operational instructions, and the like. Among other capabilities, the processor 112 may fetch and execute computer-readable instructions in the memory 116 operationally coupled with the system 110 for performing tasks such as data processing, input/output processing, feature extraction, and/or any other functions. Any reference to a task in the present disclosure may refer to an operation being or that may be performed on data.
[0048] In an embodiment, the radar device 140 may include, but not be limited to, radar systems, radio frequency (RF) systems, communication systems, and the like. In an embodiment, the radar device 140 may be configured to transmit or broadcast, and receive, one or more signals at any one wavelength in the electromagnetic spectrum. In an example, the radar device 140 may be configured to transmit and receive radio signals. In an embodiment, the radar device 140 may be configured to detect one or more targets in a range or a vicinity of the radar device 140. In an example, the radar device 140 may be configured to detect one or more objects in a line of sight. In an embodiment, the radar device 140 may be configured to receive echoes or radio waves reflected from one or more targets (150-1, 150-2 and 150-3) (collectively referred to as targets 150) in the region of interest. In an embodiment, the radar device may receive the reflected radio waves and detect targets therein using a Constant False Alarm Rate (CFAR) detector with a probability of false alarm Pfa.
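The CFAR detection referred to above can be illustrated with a minimal sketch. The sketch below assumes a cell-averaging (CA-CFAR) variant over a one-dimensional range profile; the function name, cell counts, and synthetic data are illustrative assumptions, not details taken from the disclosure.

```python
import numpy as np

def ca_cfar(power, num_train=8, num_guard=2, pfa=1e-3):
    """Cell-averaging CFAR over a 1-D range profile (illustrative sketch).

    power: squared-magnitude samples along range.
    Returns a boolean detection mask, one entry per range bin.
    """
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    # Threshold scaling factor derived from the desired false-alarm
    # probability for a CA-CFAR with 2 * num_train training cells.
    num_cells = 2 * num_train
    alpha = num_cells * (pfa ** (-1.0 / num_cells) - 1.0)
    for i in range(num_train + num_guard, n - num_train - num_guard):
        # Training cells on both sides of the cell under test, skipping guard cells.
        lead = power[i - num_train - num_guard : i - num_guard]
        lag = power[i + num_guard + 1 : i + num_guard + 1 + num_train]
        noise_estimate = np.mean(np.concatenate([lead, lag]))
        detections[i] = power[i] > alpha * noise_estimate
    return detections

# Synthetic profile: exponential noise floor with a strong return at bin 30.
rng = np.random.default_rng(0)
profile = rng.exponential(1.0, 64)
profile[30] += 40.0
mask = ca_cfar(profile)
```

Because the threshold adapts to the locally estimated noise level, the detector maintains roughly the specified false-alarm rate even when the noise floor varies along range.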
[0049] In an embodiment, the radar device 140 may be configured to transmit the reflected radio signals to the system 110 as a first set of radio signals. In an embodiment, the one or more targets 150 may be air targets including, but not limited to, aircrafts, helicopters, unmanned aerial vehicles, missiles, and the like, or ground targets including, but not limited to, automobiles, trains, ships, and the like. In an embodiment, each of the one or more targets 150 may include a transponder unit configured to transmit one or more response signals on receiving one or more request signals.
[0050] In an embodiment, the communication network 106 may be a wired communication network or a wireless communication network. The wireless communication network may be any wireless communication network capable of transferring data between entities of that network such as, but not limited to, a Bluetooth, a Zigbee, a Near Field Communication (NFC), a Wireless-Fidelity (Wi-Fi) network, a Light Fidelity (Li-Fi) network, a carrier network including a circuit-switched network, a packet switched network, a Public Switched Telephone Network (PSTN), a Content Delivery Network (CDN) network, an Internet, intranets, Local Area Networks (LANs), Wide Area Networks (WANs), mobile communication networks including a Second Generation (2G), a Third Generation (3G), a Fourth Generation (4G), a Fifth Generation (5G), a Sixth Generation (6G), a Long-Term Evolution (LTE) network, a New Radio (NR), a Narrow-Band (NB), an Internet of Things (IoT) network, a Global System for Mobile Communications (GSM) network and a Universal Mobile Telecommunications System (UMTS) network, combinations thereof, and the like.
[0051] In an aspect, the system 110 may generate a range-time plot based on the set of signals received and accumulated over a predetermined time interval by the radar device 140. In an embodiment, the generated range-time plots may be indicative of Short-Time Fourier Transform (STFT) signatures of different classes. In an embodiment, the range-time plots may be generated in real time.
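The accumulation of received signals into a range-time plot can be sketched as follows. This is a minimal illustration assuming the range profile of each pulse is obtained by a windowed FFT of the fast-time samples (as in FMCW or pulse-compressed radar); the function name and synthetic data are assumptions for illustration only.

```python
import numpy as np

def build_range_time_plot(pulses, window=np.hanning):
    """Accumulate per-pulse range profiles into a range-time matrix (sketch).

    pulses: 2-D complex array, shape (num_pulses, num_samples), one received
    pulse per row, accumulated over the predetermined time interval.
    Returns magnitudes in dB; rows = time (pulse index), columns = range bin.
    """
    num_pulses, num_samples = pulses.shape
    w = window(num_samples)
    # Range profile of each pulse: windowed FFT over the fast-time samples.
    profiles = np.fft.fft(pulses * w, axis=1)
    magnitude = np.abs(profiles)
    return 20.0 * np.log10(magnitude + 1e-12)

# Synthetic example: a stationary target appearing at range bin 40 over 100 pulses.
t = np.arange(256)
pulses = np.exp(2j * np.pi * 40 * t / 256)[None, :].repeat(100, axis=0)
plot = build_range_time_plot(pulses)
```

Stacking successive profiles row by row yields the two-dimensional range-time plot in which target tracks appear as bright ridges over time.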
[0052] In an embodiment, the system 110 may transform the range-time plots into a test image of a predefined set of dimension values. In an embodiment, the predefined set of dimension values may be NxN, where 'N' may be an integer value chosen as the first and second dimension of the test image. In such embodiments, the length dimension of the test image and the breadth dimension of the test image may be equal. In an embodiment, the test image may be provided as an input to a pretrained NN. In an embodiment, the pretrained NN may be a deep convolutional neural network (DCNN). In an embodiment, the pretrained NN may be trained using batch normalization on a training dataset having one or more labelled images for each of the classes in a set of classes. In an example, the set of classes may include, but not be limited to, (1) one target, (2) two targets, (3) three targets, (4) four targets, (5) five targets, or (6) zero targets, as shown in FIGs. 3A-F. In an embodiment, the labelled images may be indicative of Short-Time Fourier Transform (STFT) signatures of different classes. Each of the classes may have a unique STFT signature corresponding to the number of targets detected by the radar signal.
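The transformation of a range-time plot into an N x N test image can be sketched as below. The sketch assumes nearest-neighbour resampling and min-max normalisation to keep it dependency-free; a real pipeline might instead use an interpolating resize from an image library. The function name is hypothetical.

```python
import numpy as np

def to_test_image(plot, n=64):
    """Resample a range-time plot to an N x N test image (illustrative).

    Uses nearest-neighbour index mapping so the sketch stays dependency-free.
    Pixel values are min-max normalised to [0, 1] before feeding the NN.
    """
    rows, cols = plot.shape
    # Map each output pixel to the nearest source row/column.
    row_idx = np.arange(n) * rows // n
    col_idx = np.arange(n) * cols // n
    img = plot[np.ix_(row_idx, col_idx)]
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-12)

# A plot of arbitrary shape becomes a 64 x 64 test image.
image = to_test_image(np.random.default_rng(1).normal(size=(100, 256)), n=64)
```

Fixing the dimensions this way lets plots accumulated over different intervals, or with different numbers of range bins, share one classifier input shape.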
[0053] In an example, the pretrained NN may be trained using the Keras library. In an example, the pretrained NN may be trained with a learning rate of 0.001, with the number of iterations per epoch equal to the number of training samples divided by the batch size. In an embodiment, the batch size may be 32. Further, the predefined set of dimensions for the test image may be 64 x 64 x 3 (where 'N' is 64 and 3 is the number of channels associated with the image). In an example, the validation split may be 0.25. In an embodiment, the training dataset may have at least about 200 images for each class in the set of classes. However, it may be appreciated by those skilled in the art that the size of the training dataset, batch size, number of iterations, predefined set of dimensions, validation split, and other parameters and hyperparameters of the NN may be suitably adapted to implement the system 110 or the method based on the use requirements.
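The batch normalization used during training can be illustrated in plain NumPy. This is a sketch of the training-time forward pass of a batch-normalization layer only, with the batch size of 32 drawn from the example above; the function name, feature count, and learned parameters shown are illustrative assumptions.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch normalization over a mini-batch (training-time forward pass).

    x: activations of shape (batch_size, features), e.g. batch_size = 32.
    gamma, beta: learned per-feature scale and shift parameters.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Normalise each feature to zero mean and unit variance across the batch,
    # then apply the learned affine transform.
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# A mini-batch of 32 samples with 10 features, deliberately off-centre.
batch = np.random.default_rng(2).normal(loc=5.0, scale=3.0, size=(32, 10))
out = batch_norm_forward(batch, gamma=np.ones(10), beta=np.zeros(10))
```

Normalising activations per mini-batch in this way stabilises training of the DCNN, which is why the small batch size of 32 can still train effectively at a learning rate of 0.001.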
[0054] In an embodiment, the system 110 may be configured to compare the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images. In an embodiment, the system 110 may compare the test image and the one or more labelled images using a comparison model. In an embodiment, the comparison model may be indicative of the pretrained NN. In an embodiment, the one or more labelled images in the catalogue may correspond to a subset of labelled images from the training dataset. In an embodiment, the subset of labelled images may be exemplar range-time plots for each of the classes in the set of classes, the test image being compared with each of the subset of labelled images to determine the number of targets detected in said test image.
[0055] In an embodiment, the system 110 may classify the test image into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison. In an embodiment, the pretrained NN may assign a label or one class from the set of classes based on the probability value evaluated for each class for the test image. In an embodiment, the probability value for each class may be evaluated based on similarity between the test image and each labelled image in the exemplar subset. In an embodiment, when the test image is classified into one class from the set of classes with a probability value less than a predetermined probability value threshold, the system 110 may store, by the processor 112, the test image in the catalogue of labelled images. Further, the system 110 may assign a label indicative of a new class to the stored image, and update the set of classes with the new class. In an embodiment, the label may be assigned by a human, indicating the number of targets in the test image. In another embodiment, the system 110 may assign the stored test image a placeholder label, which may be correlated with the number of targets in said test image. The system 110 may then retrain the pretrained NN such that the pretrained NN learns to classify the test image into any one class from the updated set of classes. In such embodiments, the task of determining the number of targets detected in the range-time plots may be performed as a classification task.
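The thresholded classification decision described above can be sketched as follows. The function name, the example threshold of 0.9, the class labels, and the raw scores are illustrative assumptions; the softmax step stands in for however the pretrained NN produces its per-class probability values.

```python
import numpy as np

def classify_or_flag(logits, class_labels, threshold=0.9):
    """Classify a test image, or flag it as a candidate new class (sketch).

    logits: raw scores from the pretrained NN, one per known class.
    Returns (label, probability); label is None when the best probability
    falls below the threshold, signalling that the image should be stored
    in the catalogue, labelled as a new class, and the NN retrained.
    """
    # Softmax converts raw scores into per-class probability values.
    exp = np.exp(logits - np.max(logits))
    probs = exp / exp.sum()
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return None, float(probs[best])
    return class_labels[best], float(probs[best])

labels = ["zero", "one", "two", "three", "four", "five"]
# Confident case: one class dominates; uncertain case: scores are near-uniform.
confident = classify_or_flag(np.array([0.1, 8.0, 0.2, 0.1, 0.0, 0.3]), labels)
uncertain = classify_or_flag(np.array([1.0, 1.1, 1.0, 0.9, 1.0, 1.0]), labels)
```

Routing low-confidence images into the catalogue as candidate new classes is what lets the set of classes, and hence the maximum number of detectable targets, grow over time through retraining.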
[0056] In an embodiment, a non-transitory computer-readable medium for determining number of targets from radar signals may include processor-executable instructions that cause a processor to perform the steps of the method disclosed herein.
[0057] FIG. 2 illustrates an exemplary block diagram representation of the proposed system 110, according to embodiments of the present disclosure.
[0058] In an embodiment, data 120 may include radar signal data 121, image data 122, catalogue data 123, comparison model data 124, threshold data 125, and other data 128. In an embodiment, the data 120 may be stored in the memory 116 in the form of various data structures. Additionally, the data 120 may be organized using data models, such as relational or hierarchical data models. The other data 128 may store data, including temporary data and temporary files, generated by modules 130 for performing the various functions of the system 110.
[0059] In an embodiment, the modules 130 may include a receiving module 131, a generating module 132, a transforming module 133, a comparing module 134, a classifying module 135 and other modules 138.
[0060] In an embodiment, the data 120 stored in the memory 116 may be processed by the modules 130 of the system 110. The modules 130 may be stored within the memory 116. In an example, the modules 130, communicatively coupled to the processor 112 configured in the system 110, may also be present outside the memory 116, as shown in FIG. 2, and implemented as hardware. As used herein, the term modules may refer to an Application-Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
[0061] In an embodiment, the receiving module 131 may receive, from the radar device 140, a set of radar signals reflected from one or more targets. In an embodiment, the set of signals may be stored as radar signal data 121. In an embodiment, the radar device 140 may transmit the set of radar signals received as a set of data packets to the receiving module 131.
[0062] In an embodiment, the generating module 132 may generate a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the radar device 140. In an embodiment, the transforming module 133 may transform the range-time plots into a test image of a predefined set of dimension values. In an embodiment, the test image may be stored as image data 122.
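The transformation performed by the transforming module 133 can be sketched as follows. This is a minimal illustrative example, not the disclosed implementation: the function name `to_test_image`, the default dimension value, and the nearest-neighbour resampling are assumptions introduced here for clarity.

```python
import numpy as np

def to_test_image(range_time_plot, n=64):
    """Resample a range-time plot (2-D array of echo intensities
    accumulated over the predetermined time interval) to an n x n
    test image and scale pixel values to the 8-bit range [0, 255]."""
    plot = np.asarray(range_time_plot, dtype=float)
    rows, cols = plot.shape
    # Nearest-neighbour resampling to the predefined n x n dimension.
    row_idx = (np.arange(n) * rows // n).clip(0, rows - 1)
    col_idx = (np.arange(n) * cols // n).clip(0, cols - 1)
    resized = plot[np.ix_(row_idx, col_idx)]
    # Normalise intensities to an 8-bit pixel range.
    span = resized.max() - resized.min()
    if span == 0:
        return np.zeros((n, n), dtype=np.uint8)
    return ((resized - resized.min()) / span * 255).astype(np.uint8)
```

For example, a 500 x 120 range-time accumulation would be mapped to a 64 x 64 test image suitable as NN input.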
[0063] In an embodiment, the comparing module 134 may compare the test image with one or more images in the catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images. In an embodiment, comparing the test images comprises using a pretrained neural network (NN), the pretrained NN being trained using batch normalization. In an embodiment, the pretrained NN may be trained on a training dataset having the one or more labelled images for each of the classes in the set of classes. In an embodiment, the one or more labelled images may be stored as catalogue data 123. In an embodiment, the pretrained NN may be stored as comparison model data 124.
[0064] In an embodiment, the classifying module 135 may classify the test images into any one class from the set of classes indicative of the number of targets detected in the test image. In an embodiment, the classifying module 135 may classify the test image into any one class from the set of classes based on the probability value provided for said test image for each class in the set of classes. In an embodiment, the classifying module 135 may classify the test image into the class in the set of classes that is provided with the highest probability value. In an embodiment, the classifying module 135 may compare the probability value of the chosen class with a predetermined probability threshold value. In an embodiment, when the probability value of the chosen class is less than the predetermined probability threshold, the test image may be stored in the catalogue of labelled images. In such embodiments, the test image may be assigned a label indicative of a new class, and the set of classes may be updated with the new class. Further, the pretrained NN may be retrained such that the pretrained NN learns to classify the test image into any one class from the updated set of classes.
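The decision rule of the classifying module 135 — choose the highest-probability class, but fall back to cataloguing when confidence is below the threshold — can be sketched as below. The function name, the threshold default, and the `None` return convention are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def classify_test_image(probabilities, classes, threshold=0.6):
    """Pick the class with the highest probability value; if that
    probability falls below the predetermined threshold, return None so
    the caller can store the image in the catalogue under a new class
    and schedule retraining."""
    probs = np.asarray(probabilities, dtype=float)
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return None, float(probs[best])  # low confidence: new class needed
    return classes[best], float(probs[best])
```

For instance, probabilities of `[0.05, 0.85, 0.10]` over classes `[0, 1, 2]` would yield class 1 (one target detected), while a flat distribution would trigger the new-class path.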
[0065] FIGs. 3A-F illustrate exemplary representations of labelled images in a training dataset, according to embodiments of the present disclosure. FIG. 3A shows a range-time representation of radar signals received by the radar device 140 when no targets are present in the area of interest. FIGs. 3B, 3C, 3D, 3E and 3F show range-time representations of radar signals received by the radar device 140 when 1, 2, 3, 4, and 5 targets are present in the area of interest, respectively. In an example, each of the range-time plots may be converted into one or more labelled images and stored as the training data for the pretrained NN. In an embodiment, the system 110 may be configured to determine the number of targets in the region of interest based on the test image derived from the range-time plot of the radar signals received by the radar device 140. The system 110 may determine the number of targets based on the similarity of the test image with the labelled images in the training dataset. The similarity may be inferred by providing the test image as input to the pretrained NN.
[0066] FIGs. 4A-D illustrate flow charts depicting methods 200, 300, 400, and 500 for determining number of targets from radar signals and training models thereof, according to embodiments of the present disclosure.
[0067] In an embodiment, the method 200 allows for determination of number of targets detected in the range-time plot.
[0068] At step 202, the method 200 includes receiving, by a radar device such as the radar device 140 of FIGs. 1 and 2, a set of radar signals reflected from one or more targets.
[0069] At step 204, the method 200 includes generating, by a processor such as the processor 112 of FIGs. 1 and 2, a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the radar device.
[0070] At step 206, the method 200 includes transforming, by the processor, the range-time plots into a test image of a predefined set of dimension values.
[0071] At step 208, the method 200 includes comparing, by the processor, the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled image being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled image.
[0072] At step 210, the method 200 includes classifying, by the processor, the test images into any one class from the set of classes indicative of the number of targets detected in the test image.
[0073] In an embodiment, the method 300 may relate to training of a pretrained NN. At step 301, the one or more labelled images may be collected and stored in the training dataset. In an embodiment, the labels assigned to the labelled images may correspond to any one class from the set of classes, where each class in said set of classes is indicative of the number of targets detected in the radar signals received by the radar device. At step 302, the one or more labelled images may be pre-processed for training the NN. In an embodiment, pre-processing the labelled images may include converting said one or more labelled images into an NxN dimension, where ‘N’ is a predefined dimension value. At step 303, the one or more labelled images may be provided as input to the NN for training. In an embodiment, the NN may be a classification model that classifies the input into one class from a set of classes. In an embodiment, the NN may be indicative of a deep convolutional neural network model. At step 304, the trained model is stored as the pretrained NN for inference.
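Steps 301-304 can be illustrated with a deliberately minimal stand-in: the sketch below fits a linear softmax classifier on flattened NxN labelled images in plain NumPy. The disclosure contemplates a deep convolutional NN trained with batch normalization, so the model architecture, learning rate, and epoch count here are placeholder assumptions for illustration only.

```python
import numpy as np

def train_softmax_classifier(images, labels, num_classes, epochs=200, lr=0.5):
    """Fit a linear softmax classifier on flattened labelled images.
    Illustrative stand-in for step 303; the full method would use a
    deep CNN rather than a single linear layer."""
    x = np.asarray(images, dtype=float).reshape(len(images), -1)
    x = (x - x.mean()) / (x.std() + 1e-8)          # simple normalisation
    y = np.eye(num_classes)[np.asarray(labels)]    # one-hot class targets
    w = np.zeros((x.shape[1], num_classes))
    b = np.zeros(num_classes)
    for _ in range(epochs):
        logits = x @ w + b
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        grad = (p - y) / len(x)                    # cross-entropy gradient
        w -= lr * x.T @ grad
        b -= lr * grad.sum(axis=0)
    return w, b

def predict(w, b, images):
    """Assign each image the class with the highest score (step 404)."""
    x = np.asarray(images, dtype=float).reshape(len(images), -1)
    x = (x - x.mean()) / (x.std() + 1e-8)
    return np.argmax(x @ w + b, axis=1)
```

The trained `(w, b)` pair plays the role of the stored pretrained model of step 304, against which new test images are scored at inference time.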
[0074] In an embodiment, the method 400 may relate to testing of the pretrained NN. At step 401, the test image generated from the set of signals may be pre-processed for providing said image as input to the pretrained NN for inference. In an embodiment, pre-processing the generated image may include steps associated with pre-processing of the one or more labelled images at step 302. At step 402, the pre-processed image is inputted to the pretrained NN as input. At step 403, the pretrained NN processes the inputted image and, at step 404, assigns a label to said image. In an embodiment, the label may be indicative of one class from the set of classes to which the pretrained NN assigns the highest probability value.
[0075] In an embodiment, the method 500 relates to retraining of the pretrained NN. In an embodiment, the pretrained NN may not be able to assign a class to the test image with sufficient confidence, as indicated at step 501; that is, the assigned class may have a probability value less than a predefined probability value threshold. At step 502, the model may accordingly fail to recognize or assign a class to the input image associated with the one or more targets 150. At step 503, the test image may be saved in the training dataset, thereby updating the training dataset. In an embodiment, data including, but not limited to, type of target, number of targets in the test image, and the like, may be collected and associated with the test image stored in the training dataset. At step 504, the pretrained NN may be retrained with the updated training dataset. In an embodiment, the retrained NN may be able to recognize and assign classes to previously unrecognizable images, thereby increasing the accuracy of the pretrained NN.
[0076] The order in which the methods 200, 300, 400 and 500 are described is not intended to be construed as a limitation, and any number of the described method blocks may be combined or otherwise performed in any order to implement the methods 200, 300, 400 and 500 or an alternate method. Additionally, individual blocks may be deleted from the methods 200, 300, 400 and 500 without departing from the scope of the present disclosure described herein. Furthermore, the methods 200, 300, 400 and 500 may be implemented in any suitable hardware, software, firmware, or a combination thereof that exists in the related art or that is later developed. The methods 200, 300, 400 and 500 describe, without limitation, the implementation of the system 110. A person of skill in the art will understand that the methods 200, 300, 400 and 500 may be modified appropriately for implementation in various manners without departing from the scope of the disclosure.
[0077] FIG. 5 illustrates a hardware platform 600 for the implementation of the proposed system 110, according to embodiments of the present disclosure. For the sake of brevity, the construction and operational features of the system 110 which are explained in detail above are not explained in detail herein. Particularly, computing machines such as, but not limited to, internal/external server clusters, quantum computers, desktops, laptops, smartphones, tablets, and wearables may be used to execute the system 110 and may include the structure of the hardware platform 600. As illustrated, the hardware platform 600 may include additional components not shown, and some of the components described may be removed and/or modified. For example, a computer system with multiple Graphics Processing Units (GPUs) may be located on external-cloud platforms or internal corporate cloud computing clusters, or organizational computing resources, and the like.
[0078] The hardware platform 600 may be a computer system such as the system 110 that may be used with the embodiments described herein. The computer system may represent a computational platform that includes components that may be in a server or another computer system. The computer system may execute, by the processor 605 (e.g., a single or multiple processors) or other hardware processing circuit, the methods, functions, and other processes described herein. These methods, functions, and other processes may be embodied as machine-readable instructions stored on a computer-readable medium, which may be non-transitory, such as hardware storage devices (e.g., random access memory (RAM), read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), hard drives, and flash memory). The computer system may include the processor 605 that executes software instructions or code stored on a non-transitory computer-readable storage medium 610 to perform methods of the present disclosure. The software code includes, for example, instructions to gather data and documents and analyze documents. In an example, the modules 130 may be software codes or components performing these steps. For example, the modules may include a receiving module 131, a generating module 132, a transforming module 133, a comparing module 134, a classifying module 135 and other modules 138.
[0079] The instructions on the computer-readable storage medium 610 may be read and stored in the storage 615 or in random access memory (RAM). The storage 615 may provide a space for keeping static data where at least some instructions could be stored for later execution. The stored instructions may be further compiled to generate other representations of the instructions and dynamically stored in the RAM, such as RAM 620. The processor 605 may read instructions from the RAM 620 and perform actions as instructed.
[0080] The computer system may further include the output device 625 to provide at least some of the results of the execution as output including, but not limited to, visual information to users, such as external agents. The output device 625 may include a display on computing devices and virtual reality glasses. For example, the display may be a mobile phone screen or a laptop screen, where a Graphical User Interface (GUI) and/or text may be presented as an output on the display screen. The computer system may further include an input device 630 to provide a user or another device with mechanisms for entering data and/or otherwise interacting with the computer system. The input device 630 may include, for example, a keyboard, a keypad, a mouse, or a touchscreen. Each of these output devices 625 and input device 630 may be joined by one or more additional peripherals. For example, the output device 625 may be used to display the results.
[0081] A network communicator 635 may be provided to connect the computer system to a network and in turn to other devices connected to the network including other clients, servers, data stores, and interfaces, for instance. A network communicator 635 may include, for example, a network adapter such as a Local Area Network (LAN) adapter or a wireless adapter. The computer system may include a data sources interface 640 to access the data source 645. The data source 645 may be an information resource. As an example, a database of exceptions and rules may be provided as the data source 645. Moreover, knowledge repositories and curated data may be other examples of the data source 645.
[0082] The present disclosure, therefore, solves the need for a method and a system for determining number of targets from radar signals.
[0083] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions, or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE INVENTION
[0084] The present disclosure provides a method and a system for determining number of targets from radar signals.
[0085] The present disclosure provides a method and a system for determining number of targets from radar signals in real time.
[0086] The present disclosure provides a method and a system for determining number of targets detected in range-time plots.
[0087] The present disclosure provides a method and a system for determining number of targets from radar signals with increased accuracy.

Claims:

1. A method for determining number of targets from radar signals, comprising:
receiving, by a radar device, a set of radar signals reflected from one or more targets;
generating, by a processor, a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the radar device;
transforming, by the processor, the range-time plots into a test image of a predefined set of dimension values;
comparing, by the processor, the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled images being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled images; and
classifying, by the processor, the test images into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
2. The method as claimed in claim 1, wherein the test image has an NxN dimension, where N is the predefined dimension value.
3. The method as claimed in claim 1, wherein comparing the test images comprises using a pretrained neural network (NN), the pretrained NN being trained using batch normalization on a training dataset having the one or more labelled images for each of the classes in the set of classes.
4. The method as claimed in claim 1, wherein when the test image is classified into one class from the set of classes with a probability value less than a predetermined probability value threshold, the method comprises:
storing, by the processor, the test image in the catalogue of labelled images;
assigning, by the processor, a label indicative of a new class to the stored image, and updating the set of classes with the new class; and
retraining, by the processor, a pretrained NN such that the pretrained NN learns to classify the test image into any one class from the updated set of classes.
5. A system for determining number of targets from radar signals, the system comprises:
one or more radar devices that receive a set of radar signals reflected from one or more targets;
a processor; and
a memory coupled to the processor, wherein the memory comprises processor-executable instructions, which on execution, cause the processor to:
generate a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by the one or more radar devices;
transform the range-time plots into a test image of a predefined set of dimension values;
compare the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled image being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled image; and
classify the test images into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
6. The system as claimed in claim 5, wherein the test image has an NxN dimension, where N is the predefined dimension value.
7. The system as claimed in claim 5, wherein comparing the test images comprises using a pretrained neural network (NN), the pretrained NN being trained using batch normalization on a training dataset having the one or more labelled images for each of the classes in the set of classes.
8. The system as claimed in claim 5, wherein when the test image is classified into one class from the set of classes with a probability value less than a predetermined probability value threshold, the processor is configured to:
store the test image in the catalogue of labelled images;
assign a label indicative of a new class to the stored image, and updating the set of classes with the new class; and
retrain a pretrained NN such that the pretrained NN learns to classify the test image into any one class from the updated set of classes.
9. A non-transitory computer-readable medium for determining number of targets from radar signals comprising processor-executable instructions that cause a processor to:
generate a range-time plot based on the set of radar signals received and accumulated over a predetermined time interval by one or more radar devices;
transform the range-time plots into a test image of a predefined set of dimension values;
compare the test image with a catalogue of one or more labelled images indicative of range-time plots, each of the one or more labelled image being assigned any one class from a set of classes indicative of the number of targets detected in said one or more labelled image; and
classify the test images into any one class from the set of classes indicative of the number of targets detected in the test image based on the comparison.
| # | Name | Date |
|---|---|---|
| 1 | 202341028032-STATEMENT OF UNDERTAKING (FORM 3) [17-04-2023(online)].pdf | 2023-04-17 |
| 2 | 202341028032-POWER OF AUTHORITY [17-04-2023(online)].pdf | 2023-04-17 |
| 3 | 202341028032-FORM 1 [17-04-2023(online)].pdf | 2023-04-17 |
| 4 | 202341028032-DRAWINGS [17-04-2023(online)].pdf | 2023-04-17 |
| 5 | 202341028032-DECLARATION OF INVENTORSHIP (FORM 5) [17-04-2023(online)].pdf | 2023-04-17 |
| 6 | 202341028032-COMPLETE SPECIFICATION [17-04-2023(online)].pdf | 2023-04-17 |
| 7 | 202341028032-ENDORSEMENT BY INVENTORS [02-05-2023(online)].pdf | 2023-05-02 |
| 8 | 202341028032-Proof of Right [27-05-2023(online)].pdf | 2023-05-27 |
| 9 | 202341028032-POA [07-10-2024(online)].pdf | 2024-10-07 |
| 10 | 202341028032-FORM 13 [07-10-2024(online)].pdf | 2024-10-07 |
| 11 | 202341028032-AMENDED DOCUMENTS [07-10-2024(online)].pdf | 2024-10-07 |
| 12 | 202341028032-Response to office action [01-11-2024(online)].pdf | 2024-11-01 |