
System And Method For Multi-Level Information Integration

Abstract: A multi-level information integration system (100) includes a tracking unit (102) that operates across three distinct tracking levels to generate a real-time unambiguous situation picture (USP). The three levels comprise a lowest-level tracking module (104) that gathers plot-level data from various sensors regarding target positions, and a middle-level tracking module (106) that merges tracked data to form coherent tracks for selected targets. Further, a higher-level tracking module (108) integrates data from both sensors and command and control (C2) systems, refining the track information. The fusion of data across the three distinct levels enables generation of the USP, offering a comprehensive view of the situation. Additionally, a display (110) coupled to the tracking unit (102) presents the USP, providing users with the insights necessary for informed and strategic decision-making.


Patent Information

Application #
Filing Date: 15 March 2024
Publication Number: 38/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status
Parent Application

Applicants

Bharat Electronics Limited
Corporate Office, Outer Ring Road, Nagavara, Bangalore - 560045, Karnataka, India.

Inventors

1. PRATEEK DAYAL
Central Research Laboratory, Bharat Electronics Ltd, Sahibabad, Industrial Area Site IV, Ghaziabad - 201010, Uttar Pradesh, India.
2. SAURABH GAUTAM
Central Research Laboratory, Bharat Electronics Ltd, Sahibabad, Industrial Area Site IV, Ghaziabad - 201010, Uttar Pradesh, India.
3. SOMNATH BANERJEE
Central Research Laboratory, Bharat Electronics Ltd, Sahibabad, Industrial Area Site IV, Ghaziabad - 201010, Uttar Pradesh, India.
4. ANKITA KUMARI
Central Research Laboratory, Bharat Electronics Ltd, Sahibabad, Industrial Area Site IV, Ghaziabad - 201010, Uttar Pradesh, India.
5. POOJA GOSWAMI
Central Research Laboratory, Bharat Electronics Ltd, Sahibabad, Industrial Area Site IV, Ghaziabad - 201010, Uttar Pradesh, India.

Specification

Description

TECHNICAL FIELD
[0001] The present disclosure relates generally to information integration. In particular, the present disclosure relates to a system and method for multi-level information integration to generate a real-time unambiguous situation picture (USP).

BACKGROUND
[0002] Background description includes information that may be useful in understanding the present disclosure. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed disclosure, or that any publication specifically or implicitly referenced is prior art.
[0003] In today's increasingly interconnected and data-rich world, organizations and entities across diverse sectors rely on real-time information to drive critical decisions and actions. Whether in military operations, transportation management, emergency response, or industrial automation, the ability to obtain timely and reliable insights from disparate data sources is paramount to achieving operational effectiveness and efficiency.
[0004] However, the amalgamation of data from different tracking sources presents a multifaceted challenge. Each tracking source, be it radar, GPS, sonar, Command and control systems (C2 systems) or video surveillance, possesses its own set of characteristics, limitations, and biases. Moreover, the data generated by these sources may vary in terms of accuracy, precision, update rates, and coverage, further complicating the fusion process.
[0005] Traditionally, attempts at data fusion have been hindered by several obstacles, including data incompatibility, conflicting information, sensor errors, and the lack of robust fusion algorithms. Consequently, decision-makers are often confronted with fragmented or contradictory situational awareness, impeding their ability to formulate effective responses and strategies.
[0006] There is, therefore, a need to provide a simple and cost-effective system and method equipped with fusion algorithms, advanced signal processing techniques, and machine learning models to generate a real-time unambiguous situational picture characterized by clarity, accuracy, and reliability.

OBJECTS OF THE PRESENT DISCLOSURE
[0007] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are listed herein below.
[0008] An object of the present disclosure is to provide a system and a method that enables one to make comprehensive decisions based on the fused data.
[0009] Another object of the present disclosure is to provide a system and a method to facilitate more effective decision-making across various domains and operational scenarios.
[0010] Another object of the present disclosure is to provide a system and a method to integrate data from one or more tracking sources, overcoming challenges such as data incompatibility and sensor errors.
[0011] Another object of the present disclosure is to provide a system and a method to ensure compliance with regulatory standards and requirements governing data fusion and information security, safeguarding sensitive information, and maintaining data integrity.
[0012] Yet another object of the present disclosure is to provide a system and a method that offer cost-efficient solutions for organizations seeking to enhance their decision-making capabilities.

SUMMARY
[0013] This section is provided to introduce certain objects and aspects of the present disclosure in a simplified form that are further described below in the detailed description. This summary is not intended to identify the key features or the scope of the claimed subject matter.
[0014] Aspects of the present disclosure relate generally to information integration. In particular, the present disclosure relates to a system and method for multi-level information integration to generate a real-time unambiguous situation picture (USP).
[0015] An aspect of the present disclosure elaborates upon a system for multi-level information integration that includes a tracking unit that tracks targets through three distinct levels of tracking modules to generate a real-time unambiguous situation picture (USP). The three distinct levels of tracking modules comprise a lowest-level tracking module configured to receive track data from a plurality of sensors and provide plot-level information for positional attributes of the target, the positional attributes pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof. Further, the three distinct levels of tracking modules comprise a middle-level tracking module configured to receive updated track data from the lowest-level tracking module and perform a plot-to-track fusion of the track data of the targets. Furthermore, the three distinct levels of tracking modules comprise a higher-level tracking module configured to receive the track data from both the plurality of sensors and command and control (C2) systems and perform track-to-track fusion of the track data of the targets. The three distinct levels of tracking modules facilitate the seamless fusion of the track data to generate the USP. The system further comprises a display coupled with the tracking unit that displays the USP, offering a holistic view of the situation for informed and strategic decision-making.
[0016] In an aspect, the real-time unambiguous situation picture (USP) may be generated through kinematic and attribute fusion of track data of the targets. The targets may be moving targets selected from air, surface, sub-surface, and space targets, and any combination thereof. The kinematic and attribute fusion of track data pertains to positional attributes, identity-related information, object classification, sensor-specific features, and any combination thereof.
[0017] In an aspect, the system may include a database that stores sensor and system characteristics, and initialization configurations crucial for the operation of the tracking unit, facilitating track conversion, fusion, gating, and association processes. Further, the system may include a repository that stores information pertaining to sensor tracks, system tracks, fused track data, and non-kinematic track attributes. Furthermore, the system may include a tactical module operatively coupled to the tracking unit, wherein the tactical module utilizes the generated USP for diverse functions pertaining to mission planning, threat assessment, identification, decision-making, mission control, and guidance, enhancing overall operational efficiency and strategic effectiveness.
[0018] In an aspect, the system delivers the USP by leveraging essential positional attributes of the target under low-bandwidth conditions, facilitating efficient data transmission over diverse communication mediums pertaining to fiber, radio, and similar channels.
[0019] In an aspect, the system may be configured to integrate any sensor or system seamlessly, may be scaled to accommodate numerous sensors/systems based on processing power, and may employ selective fusion for enhanced tracking quality. Further, the system may be configured to enhance processing speed while maintaining consistent tracking quality, and to integrate Bayesian/statistical sensors/systems without requiring knowledge of their internal implementation or specific tracking mechanisms.
[0020] In an aspect, the lowest-level tracking module corresponds to the sensor level tracking module, the middle-level tracking module corresponds to the system level tracking module and the higher-level tracking module corresponds to the system of systems level module.
[0021] In an aspect, the lowest-level tracking module may be configured to convert the track data into internal processing formats by a sensor data processor, upon receiving the track data and compensating the track data associated with bias errors, by a bias compensation module. Further, the lowest-level tracking module may be configured to convert the track data to a common reference frame (CRF), by a coordinate conversion module and subject, by a gating module, co-ordinate converted plot data to reduce a number of candidate track data for current plot data. Furthermore, the lowest-level tracking module may be configured to associate by an association module, the current plot data with the reduced number of candidate track data; update, by an IMM filter, associated plot data by performing plot-to-track fusion to provide filtered kinematic values and covariance matrices for the targets; and maintain, by a local track maintenance module, track management of the filtered data by maintaining track numbers, association history and other associated non-kinematic track attributes.
[0022] In an aspect, the middle-level tracking module may be configured to initiate a coarse gating to minimize the number of candidate track data upon receiving the track data with calculated covariance from the lowest-level tracking module, and to perform ellipsoidal gating to select a smaller set of relevant candidate track data. Further, the middle-level tracking module may be configured to identify system track data requiring an update based on the relevant candidate track data through an association logic, update the identified system track data using the Interacting Multiple Model (IMM) by performing plot-to-track fusion, and maintain a reciprocal mapping between the sensor and system levels for providing the fused USP for analysis and decision-making purposes.
[0023] In an aspect, the higher-level tracking module may be configured to transform the received track data to the common reference frame, ensuring uniformity in the positional attributes of the track data of the targets, and to perform gating to identify the candidate track data consistent with new track data. Further, the higher-level tracking module may be configured to determine suitable candidate track data for updating with the new track data by employing chi-square testing in the association, and to generate the USP, composed of fused data from both the plurality of sensors and Command and Control (C2) systems, by performing the track-to-track fusion.
[0024] In another aspect, the present disclosure elaborates upon a method for multi-level information integration. The method comprises the steps of: (a) at a lowest-level tracking module: (i) receiving track data from a plurality of sensors to provide plot-level information for positional attributes of the target, the positional attributes of the target pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof; (b) at a middle-level tracking module: (i) receiving updated track data from the lowest-level tracking module; and (ii) performing plot-to-track fusion of the track data of the targets; and (c) at a higher-level tracking module: (i) receiving the track data from both the plurality of sensors and command and control (C2) systems; and (ii) performing track-to-track fusion of the track data of the targets.
[0025] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.

BRIEF DESCRIPTION OF DRAWINGS
[0026] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in, and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure, and together with the description, serve to explain the principles of the present disclosure.
[0027] In the figures, similar components, and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0028] FIG. 1 illustrates an exemplary block diagram of the proposed system for multi-level information integration to generate a real-time unambiguous situation picture (USP), in accordance with embodiments of the present disclosure.
[0029] FIG. 2 illustrates an exemplary block diagram of the proposed three distinct level tracking, in accordance with embodiments of the present disclosure.
[0030] FIG. 3 illustrates an exemplary block diagram of the proposed sensor level tracking, in accordance with embodiments of the present disclosure.
[0031] FIG. 4 illustrates an exemplary method for multi-level information integration to generate a real time Unambiguous situation picture (USP), in accordance with an embodiment of the present disclosure.
[0032] The foregoing shall be more apparent from the following more detailed description of the disclosure.

DETAILED DESCRIPTION
[0033] In the following description, for the purposes of explanation, various specific details are set forth in order to provide a thorough understanding of embodiments of the present disclosure. It will be apparent, however, that embodiments of the present disclosure may be practiced without these specific details. Several features described hereafter can each be used independently of one another or with any combination of other features. An individual feature may not address all of the problems discussed above or might address only some of the problems discussed above. Some of the problems discussed above might not be fully addressed by any of the features described herein.
[0034] FIG. 1 illustrates an exemplary block diagram of the proposed system for multi-level information integration to generate a real-time unambiguous situation picture (USP), in accordance with embodiments of the present disclosure.
[0035] Referring to FIG. 1, the system 100 for multi-level information integration for generating a real-time unambiguous situation picture (USP) is disclosed. The system 100 can be configured to receive information from multiple sensors, systems, and systems of systems in command and control (C2) to generate the real-time USP. The system 100 can be configured with kinematic fusion that fuses data tracked by the sensors and the systems, enabling the system 100 to generate a real-time, precise, and unambiguous situation picture within any command-and-control environment.
[0036] In an embodiment, the system 100 can be configured to manage sensor and system data at three distinct levels. The system 100 can include a tracking unit 102 (also referred to as tracking block 102) that can track targets at three distinct levels of tracking modules to generate the real-time USP. The three distinct levels of tracking modules may include a lowest-level tracking module 104 (interchangeably referred to as sensor-level tracking), a middle-level tracking module 106 (interchangeably referred to as system-level tracking), and a higher-level tracking module 108 (interchangeably referred to as system-of-systems-level tracking).
[0037] In an embodiment, the lowest-level tracking module 104 corresponds to the sensor level tracking module, the middle-level tracking module 106 corresponds to the system level tracking module and the higher-level tracking module 108 corresponds to the system of systems level module. Each level of the tracking module can be configured with a processor 202 that can be configured to initiate the tracking module at the three distinct levels to track data.
[0038] In an embodiment, the three distinct levels of tracking modules can facilitate a seamless fusion of the track data to generate the USP. The system 100 can include a display 110 coupled to the tracking unit 102 that displays the USP, offering a holistic view of the situation for informed and strategic decision-making. The display 110 can enable users to analyze data with precision and make informed decisions swiftly. Whether monitoring critical metrics or strategizing for optimal outcomes, the display serves as a powerful tool for enhancing situational awareness and driving effective action.
[0039] The tracked data are selected from sensor data, C2 system data, and any combination thereof for the target. Within such environments, sensors and systems such as radars, sonars, electro-optic systems, Automatic Identification Systems (AIS), ADS-B, and fire control systems, among others, constitute essential components. These sensors and systems can be strategically positioned either in close proximity or dispersed across vast geographical areas. To ensure comprehensive coverage of an area, these sensors are often deployed redundantly and designed with overlapping coverage to maximize target detection capabilities. However, to achieve real-time reliability and accuracy, specialized data transformation operations are essential. AIS data, if available, enhances track stability by mitigating track jumping and spoofing effects. The described approach streamlines the integration of sensor data, ensuring robust track generation for comprehensive surveillance.
[0040] In an embodiment, the real-time unambiguous situation picture (USP) can be generated through kinematic and attribute fusion of track data of the targets, wherein the targets are moving targets selected from air, surface, sub-surface, and space targets, and any combination thereof. The kinematic and attribute fusion of track data pertains to positional attributes, identity-related information, object classification, sensor-specific features, and any combination thereof.
[0041] In an embodiment, the system 100 can include a database 112, a repository 114, a tactical module 116, and other systems 118. The database 112 can store sensor and system characteristics, and essential initialization configurations necessary for the seamless operation of the tracking unit. The stored data plays a pivotal role in enabling track conversion, fusion, gating, and association processes by providing the necessary data and parameters required for these operations.
[0042] In an embodiment, the repository 114 can store a wide array of information crucial to the tracking system. This includes but is not limited to sensor tracks, system tracks, fused track data, and various non-kinematic track attributes, wherein the non-kinematic track attributes pertain to identifiers of targets, category of the targets and any combination thereof.
[0043] In an exemplary embodiment, the repository 114 (also referred to as track repository 114) can be configured to maintain tracked data at the sensor- and system-level processing. The track repository 114 can be configured to store all the information about the sensor tracks, system tracks, and fused track data, i.e., the unambiguous situation picture generated by the system. It also stores other non-kinematic track attributes reported by the sensors and systems for enhancement of the USP.
[0044] In an embodiment, the tactical module 116 can operatively couple to the tracking unit 102. The tactical module 116 can be configured to leverage the generated USP for a wide range of functions crucial to mission success. These include but are not limited to mission planning, threat assessment, target identification, decision-making, mission control, and guidance. By utilizing the USP, the tactical modules can significantly enhance overall operational efficiency and strategic effectiveness, enabling smoother coordination and more informed actions in dynamic operational environments.
[0045] In an embodiment, the system 100 can be configured to deliver USP by leveraging essential positional attributes of the target under low bandwidth conditions, facilitating efficient data transmission over diverse communication mediums pertaining to fiber, radio, and similar channels.
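As a hypothetical, non-limiting illustration of the low-bandwidth delivery described above, the essential positional attributes of a track can be packed into a fixed-size binary record before transmission over fiber or radio. The field layout, scaling factors, and 14-byte size below are assumptions made for illustration only; they are not a format defined by the present disclosure.

```python
import struct

# Compact track record: id, latitude, longitude, course, speed.
# Lat/lon are scaled to 1e-6 degree integers, course to 0.01 degree,
# speed to whole metres per second (all illustrative choices).
TRACK_MSG = struct.Struct("<Hiihh")  # 2 + 4 + 4 + 2 + 2 = 14 bytes

def pack_track(track_id, lat_deg, lon_deg, course_deg, speed_mps):
    """Encode essential positional attributes into a fixed 14-byte record."""
    return TRACK_MSG.pack(track_id,
                          int(lat_deg * 1e6),
                          int(lon_deg * 1e6),
                          int(course_deg * 100),
                          int(speed_mps))

def unpack_track(buf):
    """Decode a 14-byte record back into engineering units."""
    tid, lat, lon, course, speed = TRACK_MSG.unpack(buf)
    return tid, lat / 1e6, lon / 1e6, course / 100.0, speed
```

At 14 bytes per track, even a few-kilobit radio channel could refresh hundreds of tracks per second, which is the kind of budget the paragraph above alludes to.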
[0046] In an embodiment, the system 100 can be configured to integrate any sensor or system seamlessly, scalable to accommodate numerous sensors/systems based on processing power, and employs selective fusion for enhanced tracking quality and enhanced processing speed of the system while maintaining consistent tracking quality; and integrate Bayesian/Statistical based sensors/systems without requiring knowledge of internal implementation or specific tracking mechanisms.
[0047] In an embodiment, the system 100 can be configured with one or more plug-in or integration units for accommodating one or more sensors or systems in the existing system without any loss of information or quality of tracked data. Further, the other systems 118 can be communicatively coupled with the system 100, which provides output in proprietary formats to the tactical modules 116 and is configured with the capability to provide information to the other systems 118 in standard international formats.
[0048] Thus, the present disclosure addresses the challenge of efficiently amalgamating data received from diverse sensors, systems, and systems of systems within the Command and Control (C2) domain to generate a real-time, dependable unambiguous situation picture (USP). The system includes model independence, allowing seamless integration of information from disparate sensors and systems, irrespective of variations in inputs/outputs, characteristics, technology implementations, and deployment scales. The method permits the utilization of any Bayesian-class or statistical algorithm for target tracking, accommodating air, ground, space, and sub-surface targets. Furthermore, the invention offers an expandable architecture facilitating the plug-and-play integration of new sensors or subsystems into the existing system without necessitating extensive modifications. The configurable nature of the system allows users to dynamically tailor the output based on specific requirements, adjusting the contribution of certain sensors or systems in real time. Additionally, the system demonstrates flexibility by fusing information from sensors/systems operating in a global reference frame with undisclosed or restricted deployment information due to security considerations.
[0049] FIG. 2 illustrates an exemplary block diagram of the proposed three distinct level tracking, in accordance with embodiments of the present disclosure.
[0050] The three distinct levels of tracking modules may include the lowest-level tracking module 104, the middle-level tracking module 106, and the higher-level tracking module 108.

SENSOR LEVEL TRACKING
[0051] Referring to FIG. 2, the lowest-level tracking module 104 can be configured to receive track data from a plurality of sensors to provide plot-level information for the positional attributes of the target, the positional attributes pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof. The plurality of sensors can be radars, sonars, electro-optic (EO) systems, and the like, which work independently and provide positional information of targets in their respective frames of reference. However, integrating data from the plurality of sensors poses challenges such as bias compensation, achieving a common frame of reference, and estimating inherent errors. These challenges are effectively addressed by sensor-level tracking, which facilitates the fusion of sensor data into a unified reference frame along with error estimation. Furthermore, legacy sensors often lack covariance-related information for their tracks; this is also resolved by the sensor-level tracking module so that these tracks can be combined in later stages to generate a common situation picture.
[0052] In an embodiment, the lowest-level tracking module 104, depicted in FIG. 3, can be configured to convert the track data into internal processing formats by a sensor data processor 310, converting international standard formats and other proprietary formats into internal storage formats that can be used for further tracking. Bias errors in the received track data vary from sensor to sensor and can be removed by a bias compensation module 320, which mitigates systematic errors inherent in the sensor readings. The compensated data can then be converted into a common reference frame by a coordinate conversion module 330. To ensure complete coverage, sensors can be co-located or geographically distributed over the entire area, and each sensor captures target information in its own frame of reference. To combine pictures generated by different sensors, the sensor data is coordinate-converted to a common frame of reference; while performing the conversion, WGS-84 compliance and the curvature of the earth can also be taken into consideration. Further, the sensor-level tracking can be configured to subject the coordinate-converted plot data to a gating module 340 and a local-level association 350 to reduce the number of candidate track data for the current plot data, wherein a standard ellipsoidal gate based on a Gaussian probability distribution reduces the number of candidate tracks, and a global nearest neighbour method determines track eligibility for updating.
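The ellipsoidal gate mentioned above can be sketched as follows, as a non-limiting illustration rather than the disclosed implementation: a plot remains a candidate for a track when its squared Mahalanobis distance from the track's predicted position, measured against the innovation covariance, falls below a chi-square threshold. Two-dimensional positions and hand-rolled 2x2 matrix algebra are assumptions made to keep the sketch self-contained.

```python
CHI2_GATE_2DOF_95 = 5.991  # 95% chi-square gate for 2 degrees of freedom

def inv2(m):
    """Invert a 2x2 matrix given as ((a, b), (c, d))."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return ((d / det, -b / det), (-c / det, a / det))

def mahalanobis_sq(plot, predicted, innovation_cov):
    """Squared Mahalanobis distance of a plot from a predicted position."""
    dx = plot[0] - predicted[0]
    dy = plot[1] - predicted[1]
    (i00, i01), (i10, i11) = inv2(innovation_cov)
    return dx * (i00 * dx + i01 * dy) + dy * (i10 * dx + i11 * dy)

def gate(plot, tracks, threshold=CHI2_GATE_2DOF_95):
    """Keep only the candidate tracks whose ellipsoidal gate contains the plot.
    Each track carries a predicted position 'pred' and innovation covariance 'S'."""
    return [t for t in tracks
            if mahalanobis_sq(plot, t["pred"], t["S"]) <= threshold]
```

A plot one standard deviation from one track and far from all others would survive the gate for exactly that track, which is how the candidate set shrinks before association.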
[0053] Subsequently, the plot data can be utilized for updating associated local sensor tracks using an Interacting Multiple Model (IMM) filter 360, which combines a Kalman Filter (KF) and an Extended Kalman Filter (EKF) for linear and coordinated-turn models. The IMM filter yields filtered kinematic values and covariance matrices for all targets. Track maintenance logic 370 can be configured to manage track number allocation, association history, and other non-kinematic parameters, maintaining a bi-directional mapping between sensor-level and system-level tracks. The track maintenance logic 370 is responsible for logical decision-making that enhances the stability of targets, thereby decreasing track loss probability and increasing the performance of the system.
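The IMM mode-probability bookkeeping referred to above can be sketched for two motion models (e.g., constant velocity and coordinated turn). This is a non-limiting illustration: the Markov transition matrix and the per-model measurement likelihoods are assumed inputs, and the KF/EKF state updates inside each model filter are omitted.

```python
def imm_mix_probs(mu, trans):
    """Mixing step: predicted mode probabilities c_j = sum_i p_ij * mu_i,
    plus the mixing weights mu_{i|j} used to blend the model states."""
    n = len(mu)
    c = [sum(trans[i][j] * mu[i] for i in range(n)) for j in range(n)]
    w = [[trans[i][j] * mu[i] / c[j] for j in range(n)] for i in range(n)]
    return c, w

def imm_update_probs(c, likelihoods):
    """Posterior mode probabilities after each model filter reports the
    likelihood of the current measurement under its motion hypothesis."""
    post = [likelihoods[j] * c[j] for j in range(len(c))]
    total = sum(post)
    return [p / total for p in post]
```

When the straight-line model explains a measurement much better than the turn model, its posterior probability rises, so the fused kinematic output leans toward the linear filter until a manoeuvre begins.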
[0054] In some examples, the sensors can have different standard deviations and environmental errors. The sensors can be co-located or geographically distributed as per user requirements and, apart from their placement, can be positioned on the ground, underwater, or airborne. A sensor can be static, i.e., placed at a fixed location, or mobile, i.e., placed on a mobile platform; the location of a mobile sensor changes as the platform changes its location.

SYSTEM-LEVEL TRACKING
[0055] Referring to FIG. 2, the middle-level tracking module 106 (interchangeably referred to as system-level tracking) can be configured to receive updated track data from the lowest-level tracking module 104 and perform a plot-to-track fusion of the track data of the targets in order to generate the USP. The middle-level tracking module 106 can be configured with the processor 202, which initiates a coarse gating to minimize the number of candidate track data upon receiving the track data with calculated covariance from the lowest-level tracking module, and then performs ellipsoidal gating to select a smaller set of relevant candidate track data and identify system track data requiring an update through an association logic. Furthermore, the middle-level processor can be configured to update the identified system track data using the Interacting Multiple Model (IMM) by performing plot-to-track fusion, and to maintain a reciprocal mapping between the sensor and system levels for providing the fused USP for analysis and decision-making purposes.
[0056] In an embodiment, the middle-level processor can be configured to apply a sophisticated association logic, integrating past updating history, current association checks employing chi-square analysis, and threshold history logic. The middle-level processor can be configured to determine the system track most suitable for updating; the system track is then updated using the IMM filter with plot data derived from the sensor track.
[0057] In an embodiment, the track maintenance block can serve the critical function of updating and storing track kinematics, contribution information, and other pertinent parameters related to system tracks. It establishes a bidirectional track mapping between sensor-level and system-level tracking, as well as between the system and system-of-systems levels. Additionally, the middle-level processor can be configured to furnish the fused USP in standardized formats to external entities for analysis and decision-making purposes.
[0058] In some examples, the system 100 can be configured to accept all tracked data, together with the associated errors output by the sensors, for generation of the USP.

SYSTEM OF SYSTEMS-LEVEL TRACKING
[0059] Referring to FIG. 2, in order to generate a complete USP, the higher-level tracking module 108 (interchangeably referred to as system-of-systems-level tracking 108) can be configured to receive the track data from both the plurality of sensors and command and control (C2) systems and perform track-to-track fusion of the track data of the targets. The higher-level tracking module 108, configured with the processor 202, can transform the received track data to the common reference frame, ensuring uniformity in the positional attributes of the track data of the targets. The positional attributes may arrive in a global reference frame (geodetic or geocentric coordinates) or other standard international formats, and the data (position, course, speed, accuracies, etc.) are converted into the common reference frame. The higher-level processor can be configured to perform gating to identify the candidate track data consistent with new track data and to determine suitable candidate track data for updating with the new track data by employing chi-square testing in the association. Further, the higher-level processor can be configured to generate the USP, composed of fused data from both the plurality of sensors and command and control (C2) systems, by performing the track-to-track fusion. The culmination of this process yields the complete USP, including fused data from diverse sensors and systems. This real-time, dependable USP is indispensable for tactical decision-making within any command and control (C2) system.
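The conversion of geodetic positional attributes into a common geocentric frame can be sketched as follows. The constants are the standard WGS-84 ellipsoid values; the function is a non-limiting illustration of the coordinate conversion step, not the disclosed implementation, and covariance transformation is omitted.

```python
import math

# Standard WGS-84 ellipsoid parameters.
WGS84_A = 6378137.0                    # semi-major axis, metres
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
    """Convert latitude/longitude/altitude to earth-centred, earth-fixed
    (x, y, z) coordinates in metres, accounting for the curvature of the
    earth via the prime-vertical radius N."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z
```

Once every contributor reports in the same earth-fixed frame, gating and chi-square association can compare tracks from geographically dispersed systems directly.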
[0060] In some examples, the system of systems-level tracking 108 can be configured to receive track data information from sensors and trackers that provide filtered track information along with associated errors. Such tracks can also be considered system track data. Given the diverse sources of positional data from various systems, it is imperative to standardize this information into a common reference frame for effective analysis. While data from the system-level tracks is already in a unified reference frame, data from other C2 systems requires transformation for compatibility.
[0061] The data fusion process involves enhancing the value of tracked data by integrating inputs from different systems, such as highly mobile anti-aircraft or anti-drone systems, each equipped with multiple sensors. Fusion mechanisms differ from standard sensor fusion due to the varied nature of data inputs. To amalgamate data from different systems, a gating process is employed to identify contender system tracks for the current input. The association process employs chi-square testing to select the most appropriate system track for updating with the current input track.
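The gating and chi-square association steps described above can be sketched as follows; the dictionary-based track representation and the fixed gate threshold (the chi-square 99% point for two positional dimensions) are assumptions made for illustration only.

```python
import numpy as np

# Chi-square gate threshold: 2 positional dimensions at ~99% confidence.
GATE_2DOF_99 = 9.21

def mahalanobis_sq(track_pos, track_cov, input_pos, input_cov):
    """Squared Mahalanobis distance between two track position estimates,
    normalized by their combined covariance."""
    d = np.asarray(input_pos, float) - np.asarray(track_pos, float)
    s = np.asarray(track_cov, float) + np.asarray(input_cov, float)
    return float(d @ np.linalg.inv(s) @ d)

def associate(input_track, candidates, gate=GATE_2DOF_99):
    """Return the candidate system track that passes the gate with the
    smallest normalized distance, or None if no candidate is consistent."""
    best, best_d2 = None, gate
    for cand in candidates:
        d2 = mahalanobis_sq(cand["pos"], cand["cov"],
                            input_track["pos"], input_track["cov"])
        if d2 <= best_d2:
            best, best_d2 = cand, d2
    return best
```

Candidates failing the gate are discarded outright, so the association step only compares the input track against the small contender set the gate leaves behind.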
[0062] In scenarios where multiple sensors/systems update a track within a short timeframe, the resultant minor track displacement may be overshadowed by measurement errors. Continuous updates from less accurate sensors/systems can degrade the overall tracking performance. Therefore, when numerous systems contribute to a track, a decision is made to update the track primarily with data from the most accurate sensors, while utilizing less accurate ones solely for association and non-kinematic attribute updates.
[0063] If a highly accurate sensor loses coverage, the next sensor in the association list that is not currently updating the system is utilized for kinematic updates. To update the target with current data, a covariance intersection process is employed. This process involves a selected group of systems updating a track based on their measurements, weighted by their respective accuracies, which may vary dynamically based on the current covariance and tracking inaccuracy associated with that specific track.
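A minimal sketch of the covariance intersection update mentioned above, assuming two position estimates whose cross-correlation is unknown; the coarse one-dimensional scan for the weight omega (minimizing the trace of the fused covariance) is one simple choice, not necessarily the disclosed one.

```python
import numpy as np

def covariance_intersection(x1, p1, x2, p2, omega=None):
    """Fuse two consistent estimates (x1, p1) and (x2, p2) without knowing
    their cross-correlation. If omega is None, a coarse scan picks the
    weight that minimizes the trace of the fused covariance."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    p1i, p2i = np.linalg.inv(p1), np.linalg.inv(p2)
    if omega is None:
        # Coarse 1-D search over candidate weights in (0, 1).
        omegas = np.linspace(0.01, 0.99, 99)
        traces = [np.trace(np.linalg.inv(w * p1i + (1.0 - w) * p2i))
                  for w in omegas]
        omega = float(omegas[int(np.argmin(traces))])
    p = np.linalg.inv(omega * p1i + (1.0 - omega) * p2i)
    x = p @ (omega * p1i @ x1 + (1.0 - omega) * p2i @ x2)
    return x, p, omega
```

Because the weight depends on the current covariances, a source whose tracking inaccuracy grows for a specific track automatically contributes less to that track's kinematic update, consistent with the accuracy-driven selection described above.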
[0064] The final output of the system 100 can be a complete USP which may be composed of fused data from different sensors/systems. This provides a real-time, reliable USP that is essential for tactical decision-making in any C2 system. It also reduces the overall computational load on the system and increases its quality of tracking.
[0065] FIG. 4 illustrates an exemplary method for multi-level information integration to generate a real time Unambiguous situation picture (USP), in accordance with an embodiment of the present disclosure.
[0066] Referring to FIG. 4, a method 400 for multi-level information integration to generate a real-time Unambiguous situation picture (USP) is illustrated. The method 400 mainly includes three distinct level tracking steps to generate the USP. The three distinct level tracking steps can involve a lowest-level tracking module, a middle-level tracking module, and a higher-level tracking module. The lowest-level tracking module can include the step of:
[0067] As illustrated in block 402, the lowest-level tracking module can be configured to receive the track data from a plurality of sensors to provide plot-level information for positional attributes of the target, the positional attributes of the target pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof.
[0068] In an embodiment, the middle-level tracking module can include the steps of: As illustrated in block 404, receiving updated track data from the lowest-level tracking module. As illustrated in block 406, performing plot-to-track fusion of the track data of the targets.
[0069] In an embodiment, the higher-level tracking module can include the steps of: As illustrated in block 408, receiving the track data from both the plurality of sensors and command and control (C2) systems. As illustrated in block 410, performing track-to-track fusion of the track data of the targets.
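The three-level flow of blocks 402-410 can be sketched end to end as below; the dictionary-based tracks and the equal-weight placeholder fusion are illustrative assumptions standing in for the covariance-weighted fusion described in the specification.

```python
from statistics import fmean

def fuse_positions(tracks):
    """Toy fusion: equal-weight average of positions (a placeholder for the
    covariance-weighted fusion described in the specification)."""
    return {
        "x": fmean(t["x"] for t in tracks),
        "y": fmean(t["y"] for t in tracks),
        "contributors": [t.get("id", "fused") for t in tracks],
    }

def multi_level_integration(sensor_plots, c2_tracks):
    """Sketch of the three-level flow of method 400 (blocks 402-410)."""
    # Block 402: lowest level - accept plot-level data from each sensor.
    sensor_tracks = [dict(p, level="sensor") for p in sensor_plots]
    # Blocks 404-406: middle level - plot-to-track fusion into a system track.
    system_track = fuse_positions(sensor_tracks)
    # Blocks 408-410: higher level - track-to-track fusion with C2 inputs.
    return fuse_positions([system_track] + list(c2_tracks))
```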
[0070] In an exemplary embodiment, the tracking block 102 can be configured to track surface, sub-surface, air, and space targets, processing tracking data at three distinct levels via sensor-level tracking, system-level tracking, and system of systems-level tracking. The tracking block can be configured to receive inputs from sensors as well as systems and perform the task of sensor and system data fusion to generate a real-time, reliable USP.
[0071] In some examples, the command and control (C2) systems can be systems that are used to manage surveillance and reconnaissance operations. Such systems include anti-aircraft gun systems, fireable multi-sensor tracking systems, etc. The system 100 accepts track data information from the C2 systems along with their reported covariance. The data can be utilized in the system of systems-level tracking for real-time USP generation.
[0072] In an embodiment, the proposed disclosure offers a comprehensive system 100 and method 400 for making informed decisions by leveraging fused data across various domains and operational scenarios. The system 100 facilitates effective decision-making by integrating data from multiple tracking sources, addressing challenges such as data incompatibility and sensor errors. Moreover, the system 100 ensures compliance with regulatory standards, safeguarding sensitive information and maintaining data integrity. Additionally, the system 100 and method 400 provide cost-efficient solutions for organizations looking to enhance their decision-making capabilities.
[0073] While the foregoing describes various embodiments of the invention, other and further embodiments may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions, or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE INVENTION
[0074] The present disclosure is to provide a system and a method that enables one to make comprehensive decisions based on the fused data.
[0075] The present disclosure is to provide a system and a method to facilitate more effective decision-making across various domains and operational scenarios.
[0076] The present disclosure is to provide a system and a method to integrate data from one or more tracking sources, overcoming challenges such as data incompatibility and sensor errors.
[0077] The present disclosure is to provide a system and a method to ensure compliance with regulatory standards and requirements governing data fusion and information security, safeguarding sensitive information, and maintaining data integrity.
[0078] The present disclosure is to provide a system and a method that offer cost-efficient solutions for organizations seeking to enhance their decision-making capabilities.
Claims:
1. A system (100) for multi-level information integration, comprising:
a tracking unit (102) that tracks targets at three distinct levels of tracking modules to generate a real time Unambiguous situation picture (USP), the three distinct levels of tracking modules comprising:
a lowest-level tracking module (104) configured to:
receive the track data from a plurality of sensors to provide plot-level information for positional attributes of the targets, the positional attributes of the targets pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof;
a middle-level tracking module (106) configured to:
receive updated track data from the lowest-level tracking module (104);
perform plot to track fusion of the track data of the targets; and
a higher-level tracking module (108) configured to:
receive the track data from both the plurality of sensors and command and control (C2) systems;
perform track-to-track fusion of the track data of the targets;
wherein the three distinct levels of tracking modules facilitate seamless fusion of the track data to generate the USP; and
a display (110) coupled to the tracking unit displays the USP, offering a holistic view of situation for informed and strategic decision-making.

2. The system (100) as claimed in claim 1, wherein the real time Unambiguous situation picture (USP) is generated through kinematic and attribute fusion of track data of the targets, the targets are moving targets selected from air, surface, sub-surface, space targets, and any combination thereof, wherein the kinematic and attribute fusion of track data pertain to positional attributes, identity-related information, object classification, sensor-specific feature and any combination thereof.

3. The system (100) as claimed in claim 1, wherein the system (100) comprises:
a database (112) that stores sensor and system characteristics, and initialization configurations crucial for operation of the tracking unit, facilitating track conversion, fusion, gating, and association processes;
a repository (114) that stores information pertaining to sensor tracks, system tracks, fused track data, and non-kinematic track attributes; and
a tactical module (116) operatively coupled to the tracking unit (102), the tactical modules utilize the generated USP for diverse functions pertaining to mission planning, threat assessment, identification, decision-making, mission control, and guidance, enhancing overall operational efficiency and strategic effectiveness.

4. The system (100) as claimed in claim 1, wherein the system (100) delivers USP leveraging essential positional attributes of the targets under low bandwidth conditions, facilitating efficient data transmission over diverse communication mediums pertaining to fiber, radio, and similar channels.

5. The system (100) as claimed in claim 1, wherein the system (100) is configured to:
integrate any sensor or system seamlessly, scalable to accommodate numerous sensors/systems based on processing power, and employs selective fusion for enhanced tracking quality;
enhance processing speed of the system while maintaining consistent tracking quality; and
integrate Bayesian/Statistical based sensors/systems without requiring knowledge of internal implementation or specific tracking mechanisms.

6. The system (100) as claimed in claim 1, wherein the lowest-level tracking module (104) corresponds to a sensor level tracking module, the middle-level tracking module (106) corresponds to a system level tracking module and the higher-level tracking module (108) corresponds to a system of systems level module.

7. The system (100) as claimed in claim 1, wherein the lowest-level tracking module (104) configured to:
convert, by a sensor data processor (310), upon receiving the track data, the track data into internal processing formats;
compensate, by a bias compensation module (320), bias errors associated with the track data;
convert, by a coordinate conversion module (330), the track data to a common reference frame (CRF);
subject, by a gating module (340, 350), co-ordinate converted plot data to reduce a number of candidate track data for current plot data;
associate, by an association module, the current plot data with the reduced number of candidate track data;
update, by an IMM filter (360), associated plot data by performing plot to track fusion to provide filtered kinematic values and covariance matrices for the targets; and
maintain, by a local track maintenance module (370), track management of the filtered data by maintaining track numbers, association history and other associated non-kinematic track attributes.

8. The system (100) as claimed in claim 1, wherein the middle-level tracking module (106) configured to:
initiate, upon receiving the track data with calculated covariance from the lowest-level tracking module (104), a coarse gating to minimize the number of candidate track data;
perform ellipsoidal gating, to select a smaller set of relevant candidate track data;
identify a system track data requiring an update based on the relevant candidate track data through an association logic;
update identified system track data using the Interacting Multiple Model (IMM) by performing plot to track fusion; and
maintain a reciprocal mapping between sensor and system levels for providing fused USP for analysis and decision-making purposes.

9. The system (100) as claimed in claim 1, wherein the higher-level tracking module (108) configured to:
transform the received track data to the common reference frame, ensuring uniformity in the positional attributes of the track data of the targets;
perform gating to identify the candidate track data consistent with a new track data;
determine suitable candidate track data for updating with the new track data by employing Chi-square testing in the association; and
generate the USP, composed of fused data from both the plurality of sensors and Command and Control (C2) systems by performing the track-to-track fusion.

10. A method (400) for multi-level information integration, the method (400) comprising:
receiving, at a lowest-level tracking module, the track data from a plurality of sensors to provide plot-level information for positional attributes of the targets, the positional attributes of the targets pertaining to geographic location, spatial location, speed, path, acceleration, orientation, and any combination thereof;
receiving, at a middle-level tracking module (106), updated track data from the lowest-level tracking module (104);
performing, at the middle-level tracking module (106), plot to track fusion of the track data of the targets;
receiving, at a higher-level tracking module, the track data from both the plurality of sensors and command and control (C2) systems; and
performing, at the higher-level tracking module, track-to-track fusion of the track data of the targets.
