Abstract: METHOD AND SYSTEM FOR DATA MATURITY ASSESSMENT. State of the art approaches used for data management have the disadvantages that there exists no maturity benchmark and score for comparative analysis, along with a lack of a variance-level action plan and a continuous feedback loop for incremental updates. The disclosure herein generally relates to a method and system for data maturity assessment. In this method, each variance in input data collected for maturity assessment is associated with one of a plurality of variance categories, and each of the one or more variances is mapped to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. Further, a data maturity score of the input data, representing a measured data maturity of the input data, is measured. [To be published with FIG. 2]
Description: FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003
COMPLETE SPECIFICATION
(See Section 10 and Rule 13)
Title of invention:
METHOD AND SYSTEM FOR DATA MATURITY ASSESSMENT
Applicant
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India
Preamble to the description:
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
The disclosure herein generally relates to data processing, and, more particularly, to a method and system for data maturity assessment.
BACKGROUND
In any organization, data management and data safety are of prime importance. When it comes to data management, the grey area of who owns the data, who the custodians are, and who the users are creates different perspectives on the key process indicators (KPIs) used to analyze quality improvement. A holistic, new-era measure can help unify the measurable KPIs.
State of the art approaches used for data management have the disadvantage that no maturity benchmark and score exist for comparative analysis. Another disadvantage is the lack of a variance-level action plan and a continuous feedback loop for incremental updates on variances, flags, and mapping slicers.
SUMMARY
Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems. For example, in one embodiment, a method is provided. In this method, initially a selection on one or more variances relevant to a data maturity assessment is received as input data, via one or more hardware processors. Further, each of the one or more variances as being associated with one of a plurality of variance categories is determined, via the one or more hardware processors. Further, each of the one or more variances is mapped, via the one or more hardware processors, to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. Further, a data maturity score of the input data is measured based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
In another aspect, a system is provided. The system includes one or more hardware processors, a communication interface, and a memory storing a plurality of instructions. The plurality of instructions when executed, cause the one or more hardware processors to initially receive a selection on one or more variances relevant to a data maturity assessment, as input data. Further, each of the one or more variances as being associated with one of a plurality of variance categories is determined, via the one or more hardware processors. Further, each of the one or more variances is mapped, via the one or more hardware processors, to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. Further, a data maturity score of the input data is measured based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
In yet another aspect, a non-transitory computer readable medium is provided. The non-transitory computer readable medium includes a plurality of instructions, which when executed, cause one or more hardware processors to perform the following steps. Initially a selection on one or more variances relevant to a data maturity assessment is received as input data. Further, each of the one or more variances as being associated with one of a plurality of variance categories is determined. Further, each of the one or more variances is mapped to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. Further, a data maturity score of the input data is measured based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles:
FIG. 1 illustrates a block diagram of an exemplary system for data maturity assessment, according to some embodiments of the present disclosure.
FIG. 2 is a flow diagram depicting steps involved in the process of performing the data maturity assessment, using the system of FIG. 1, according to some embodiments of the present disclosure.
FIG. 3 illustrates an example implementation of the system of FIG. 1, according to some embodiments of the present disclosure.
FIG. 4 is a flow diagram depicting details of the steps in the flow diagram of FIG. 2, executed by the system of FIG. 1, according to some embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS
Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments.
State of the art approaches used for data management have the disadvantage that no maturity benchmark and score exist for comparative analysis. Another disadvantage is the lack of a variance-level action plan and a continuous feedback loop for incremental updates on variances, flags, and mapping slicers.
In order to address these challenges, a system and method are provided for data maturity assessment. The data maturity is assessed in terms of variance (data variance). In the process of determining the data variance, initially a selection on one or more variances relevant to a data maturity assessment is received as input data, via one or more hardware processors. Further, each of the one or more variances as being associated with one of a plurality of variance categories is determined, via the one or more hardware processors. Further, each of the one or more variances is mapped, via the one or more hardware processors, to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. Further, a data maturity score of the input data is measured based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
Referring now to the drawings, and more particularly to FIG. 1 through FIG. 4, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method.
FIG. 1 illustrates a block diagram of an exemplary system for data maturity assessment, according to some embodiments of the present disclosure.
The system 100 includes or is otherwise in communication with hardware processors 102, at least one memory such as a memory 104, and an I/O interface 112. The hardware processors 102, memory 104, and the Input/Output (I/O) interface 112 may be coupled by a system bus such as a system bus 108 or a similar mechanism. In an embodiment, the hardware processors 102 can be one or more hardware processors.
The I/O interface 112 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 112 may include a variety of software and hardware interfaces, for example, interfaces for peripheral device(s), such as a keyboard, a mouse, an external memory, a printer and the like. Further, the I/O interface 112 may enable the system 100 to communicate with other devices, such as web servers, and external databases.
The I/O interface 112 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, local area network (LAN), cable, etc., and wireless networks, such as Wireless LAN (WLAN), cellular, or satellite. For this purpose, the I/O interface 112 may include one or more ports for connecting several computing systems or devices with one another or to another server.
The one or more hardware processors 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, node machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the one or more hardware processors 102 are configured to fetch and execute computer-readable instructions stored in the memory 104.
The memory 104 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. In an embodiment, the memory 104 includes a plurality of modules 106.
The plurality of modules 106 include programs or coded instructions that supplement applications or functions performed by the system 100 for executing different steps involved in the process of data maturity assessment being performed by the system 100. The plurality of modules 106, amongst other things, can include routines, programs, objects, components, and data structures, which perform particular tasks or implement particular abstract data types. The plurality of modules 106 may also be used as signal processor(s), node machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions. Further, the plurality of modules 106 can be implemented in hardware, by computer-readable instructions executed by the one or more hardware processors 102, or by a combination thereof. The plurality of modules 106 can include various sub-modules (not shown). The plurality of modules 106 may include computer-readable instructions that supplement applications or functions performed by the system 100 for the data maturity assessment.
The data repository (or repository) 110 may include a plurality of abstracted pieces of code for refinement, and data that is processed, received, or generated as a result of the execution of the plurality of modules in the module(s) 106.
Although the data repository 110 is shown internal to the system 100, it will be noted that, in alternate embodiments, the data repository 110 can also be implemented external to the system 100, where the data repository 110 may be stored within a database (repository 110) communicatively coupled to the system 100. The data contained within such external database may be periodically updated. For example, new data may be added into the database (not shown in FIG. 1) and/or existing data may be modified and/or non-useful data may be deleted from the database. In one example, the data may be stored in an external system, such as a Lightweight Directory Access Protocol (LDAP) directory and a Relational Database Management System (RDBMS). Functions of the components of the system 100 are now explained with reference to the steps in flow diagrams in FIG. 2, and FIG. 4, and with reference to the components depicted in the example implementation in FIG. 3. In an embodiment, in the example implementation of the system 100 as depicted in FIG. 3, the system 100 includes a variance segregator and labeler, a variance mapping engine, a plurality of post variance feed enhancers, and a slicer engine executed by the one or more hardware processors, as components of the system 100.
FIG. 2 is a flow diagram depicting steps involved in the process of performing the data maturity assessment, using the system of FIG. 1, according to some embodiments of the present disclosure.
In an embodiment, the system 100 comprises one or more data storage devices or the memory 104 operatively coupled to the processor(s) 102 and is configured to store instructions for execution of steps of the method 200 by the processor(s) or one or more hardware processors 102. The steps of the method 200 of the present disclosure will now be explained with reference to the components or blocks of the system 100 as depicted in FIG. 1 and the steps of the flow diagram (method 200) as depicted in FIG. 2. Although process steps, method steps, techniques or the like may be described in a sequential order, such processes, methods, and techniques may be configured to work in alternate orders. In other words, any sequence or order of steps that may be described does not necessarily indicate a requirement that the steps be performed in that order. The steps of processes described herein may be performed in any order practical. Further, some steps may be performed simultaneously.
At step 202 of the method 200, a selection on one or more variances relevant to a data maturity assessment is received as input data, via the one or more hardware processors 102. Some examples of data variances are a) missing allocation of data stewards, b) data literacy issues, and c) poor training or a lack of data standards and guidelines.
Further, at step 204 of the method 200, the one or more hardware processors 102 are configured by the instructions to determine each of the one or more variances as being associated with one of a plurality of variance categories. This may be done by the variance segregator and labeler and the variance mapping engine as depicted in the example implementation in FIG. 3. The variance categories may be represented as A(1…n), B(1…n), and C(1…n). The Maturity Score Application (MSA) of the system 100, as depicted in FIG. 3, has two post-variance feeds to enhance the variance mapping engine and the slicer engine. One post-slicer feed includes processing acquired maturity measurement data to enhance the slicer engine for acquired raw slicer data, while the other post-variance feed includes processing acquired maturity measurement data to enhance the variance labeler and segregator for acquired raw variance data.
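The categorization at step 204 can be sketched as a simple lookup against a pre-defined mapping. The following is a minimal, illustrative sketch only; the table entries and the names `VARIANCE_CATEGORY` and `categorize` are hypothetical examples, not the actual mapping or components of the disclosed system:

```python
# Illustrative sketch of step 204: labeling each variance with one of the
# variance categories (Process-A, People-B, Technology-C). The mapping
# table below is hypothetical example data, not the actual pre-defined
# mapping used by the system.
VARIANCE_CATEGORY = {
    "Missing Allocation of Data Stewards": "Process-A",
    "Data Literacy issue": "People-B",
    "Lack of data standards and Guidelines": "Process-A",
}

def categorize(variances):
    """Return each input variance labeled with its variance category."""
    return {v: VARIANCE_CATEGORY[v] for v in variances}
```

In this sketch a variance absent from the table raises a `KeyError`; a real implementation would handle unmapped variances according to the system's feedback loop.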
In an embodiment, each of the one or more variances is determined as being associated with one of the plurality of variance categories based on a pre-defined mapping of the one or more variances and the plurality of variance categories. Information on the mapping may be stored in a data variance model used by the system 100, such that each variance is mapped to at least one mapping category. If the variance data is not
Further, at step 206 of the method 200, the system 100 maps each of the one or more variances to one or more associated cube slices from among a plurality of cube slices of a 4D-4C-4M cube in a data variance model, via the one or more hardware processors 102, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure. As depicted in FIG. 4, it is possible that a variance may be associated with more than one cube slice. Upon identifying that one or more of the variances are associated with more than one cube slice, the system 100 accordingly maps the variance to all applicable slices of the cube.
Further, at step 208 of the method 200, a data maturity score of the input data is measured based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices in the 4D-4C-4M cube which captures information on data variances, via the one or more hardware processors 102, wherein the data maturity score represents a measured data maturity of the input data. At this step, as depicted in FIG. 4, a coverage of a question block as well as a percentage of slice coverage are calculated, each with a specific weightage (50% each, for example, but which may be configured as needed), and a corresponding summation value is obtained as the data maturity score.
The data maturity score measured for a given data input may then be provided to the user via a suitable interface. For example, the data maturity score may be displayed using a display interface, i.e., a screen internally or externally associated with the system 100. In another example, the data maturity score may be sent to a device being used by the user, for example, a mobile phone or a tablet PC, which is configured to be in communication with the system 100 via the one or more interfaces 112.
Details of the data variance model are given below.
Data variance model:
In the data variance model, variances are classified into the plurality of categories. For explanation purposes, the number of categories is selected as three, and the three categories are represented as Process-A, People-B, and Technology-C, for which a total variance is calculated as:
Total Variance = Σ_{i=1}^{N} Prc-A_i + Σ_{i=1}^{N} Ppl-B_i + Σ_{i=1}^{N} Tec-C_i
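The total-variance summation can be computed directly as a sum over the three category series. A minimal sketch, in which the series values are hypothetical example data:

```python
# Total variance as the sum over the three category series:
# Process-A (Prc-A), People-B (Ppl-B), and Technology-C (Tec-C).
# The example values passed to this function are hypothetical.
def total_variance(prc_a, ppl_b, tec_c):
    return sum(prc_a) + sum(ppl_b) + sum(tec_c)
```

For instance, category series with values [1, 2], [3], and [4, 5] give a total variance of 15.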
To assess the data maturity, each of the variances is assigned to one of the 64 slices of a 4D-4C-4M cube, where D denotes Data, C denotes Continuous, and M denotes Measure, which are the aspects of data management used to measure the data variance.
The data block's four components (4D) are represented as Σ_{i=1}^{4} D_i.
The continuous block's four components (4C) are represented as Σ_{i=1}^{4} C_i.
The measure block's four components (4M) are represented as Σ_{i=1}^{4} M_i.
The 64 cube slices, represented as Σ_{i=1}^{64} S_i, are formed by the cross product Σ_{i=1}^{4} D_i × Σ_{i=1}^{4} C_i × Σ_{i=1}^{4} M_i.
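Enumerating the 64 slices as the cross product of the 4D, 4C, and 4M components can be sketched as follows (the function name and slice labels are illustrative):

```python
from itertools import product

# Enumerate the 64 slices S_i of the 4D-4C-4M cube, one slice per
# (D_i, C_j, M_k) triple, labeled in the style used by the examples
# (e.g. "D1C1M1").
def cube_slices():
    return ["D%dC%dM%d" % (d, c, m)
            for d, c, m in product(range(1, 5), repeat=3)]
```

The cross product of three four-element blocks yields exactly 4 × 4 × 4 = 64 slices.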
The slices are mapped to address the identified variances. One slice can address 0…N identified variances for the process.
For example,
The S1 (D1C1M1) slice is used to address three variances (Prc-A1, Ppl-B1, and Tec-C4) with the associated definition.
The S2 (D1C1M2) slice is not mapped to any variance resolution at the instance when the same was considered.
The 4Q block Σ_{i=1}^{4} Q_i is mapped to the valid Σ_{i=1}^{64} S_i blocks. For example,
Q1 is answered by five slicers (S1, S4, S9, S16, S27), so the Q1 block is assigned binary value 1 when all five slicers are mapped.
Q1 (answered by the five slicers S1, S4, S9, S16, S27) would instead take the ratio 2/5 of binary value 1 if only two slicers are mapped.
Similarly, each of Q2, Q3, and Q4 would have a ratio of the binary value based on its validated slicer mapping.
Based on the slices from the cube Σ_{i=1}^{64} S_i that are answered, the percentage coverage is calculated.
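The question-block ratio described above (e.g., Q1 taking the value 2/5 when only two of its five slicers are mapped) can be sketched as a simple proportion; the function name is an illustrative assumption:

```python
# Ratio of a question block's binary value: the number of mapped slicers
# over the number of slicers that answer the question. In the example,
# S1, S4, S9, S16, and S27 answer Q1.
def q_block_ratio(answering_slicers, mapped_slicers):
    mapped = set(mapped_slicers)
    hits = [s for s in answering_slicers if s in mapped]
    return len(hits) / len(answering_slicers)

Q1_SLICERS = ["S1", "S4", "S9", "S16", "S27"]
```

With all five slicers mapped the block takes binary value 1; with only two mapped it takes the ratio 2/5.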
Coverage of data in the cube is based on data in all the slices, and is calculated as below:
Slice-of-cube coverage calculation:
(count of slices Σ_{i=1}^{64} S_i addressing a variance) / (count of all available slices, i.e., 64)
4Q weightage, with Q having a response or being addressed:
(count of questions Σ_{i=1}^{4} Q_i addressing a variance) / (count of all question blocks, i.e., 4)
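Combining the slice coverage and the 4Q weightage into the data maturity score, with the example 50% weightage each from step 208, might look like the following sketch; the function name and defaults are illustrative assumptions:

```python
# Data maturity score as the weighted sum of slice coverage and question
# coverage. The 50/50 weightage follows the example at step 208 and may
# be reconfigured; names and defaults here are illustrative only.
def maturity_score(addressing_slices, q_ratios,
                   total_slices=64, w_slice=0.5, w_q=0.5):
    slice_coverage = addressing_slices / total_slices  # fraction of 64 slices
    q_coverage = sum(q_ratios) / len(q_ratios)         # mean 4Q block ratio
    return w_slice * slice_coverage + w_q * q_coverage
```

With 32 of the 64 slices addressing variances and all four question blocks fully covered, the score is 0.75 under the 50/50 weightage.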
In the data variance model, data is arranged based on the following metrics:
4 questions (4Q) for controlling data variance
4 data blocks (4D)
4 continuous (4C) and measure (4M) blocks
The 4 questions are:
How to discover and document the current data environment.
How to benchmark current quality of the data.
How to evaluate intrinsic quality of data in databases (desirable data quality dimensions) and extrinsic contextual aspects of data quality.
How to provide data controls and checks to ensure the future data quality state remains high.
The four Data blocks are:
1. Data Governance
a. Understand the source, definition, and reliability of information at all levels
b. Know who is accountable (Data/Asset Ownership)
2. Data Design/development
a. Technical mapping and development of variance measurement process (creation of Policy/Rules and other code for process)
b. View end-to-end lineage and impact analysis across data sources (lineage)
c. Deployment and version control for existing process with process improvements and enhancements activity
3. Data Visualization & Notification
a. Dashboards and Reports to easily monitor quality over time
b. Enforce standards about how the data is used and maintained
c. Alerts/Notification and user update for issues based on priority and severity
4. Data Remediation
a. Source System Application interface upgrade for key input fields validation
b. Data Patch or clean up jobs for fixing issues in quality during ETL process
c. Business logic change impacting the information (update logic flow code)
The four Continuous (4C) blocks follow an approach of continuous improvement (C-IMP), with Continuous Requirement (CR), Continuous Integration (CI), Continuous Development (CD), and Continuous Automation (CA) around the 4D blocks, and provide insight on the “How To” for transformation and incremental growth.
The four Measures (4M) around the process of “Continuous Improvement” provide a new approach to the quality measurement process, offering a delicate balance between the overall vision and a result-oriented solution.
The four measures are:
Culture “bring information technology (IT)-business together”,
Security “Shift Left”,
Lean “Reduce waste, reuse and recycle”, and
Automation “Wisely and effectively”.
The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims.
The embodiments of the present disclosure herein address the unresolved problem of data variance measurement. The embodiments thus provide a method and system for data variance measurement in terms of data, continuous approach, and measure.
It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g., hardware means like e.g., an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g., an ASIC and an FPGA, or at least one microprocessor and at least one memory with software processing components located therein. Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g., using a plurality of CPUs.
The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various components described herein may be implemented in other components or combinations of other components. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope of the disclosed embodiments. Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.
Claims: We Claim:
1. A processor implemented method (200), comprising steps of:
receiving (202), via a communication interface, one or more variances relevant to a data maturity assessment, as input data;
determining (204), via one or more hardware processors, each of the one or more variances as being associated with one of a plurality of variance categories;
mapping (206), via the one or more hardware processors, each of the one or more variances to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure; and
determining (208), via the one or more hardware processors, a data maturity score of the input data based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
2. The processor implemented method as claimed in claim 1, wherein each of the one or more variances is determined as being associated with one of the plurality of variance categories based on a pre-defined mapping of the one or more variances and the plurality of variance categories.
3. The processor implemented method as claimed in claim 1, wherein the input data is labelled based on the measured data maturity of the input data.
4. A system (100), comprising:
one or more hardware processors (102);
a communication interface (112); and
a memory (104) storing a plurality of instructions, wherein the plurality of instructions when executed, cause the one or more hardware processors to:
receive a selection on one or more variances relevant to a data maturity assessment, as input data;
determine each of the one or more variances as being associated with one of a plurality of variance categories;
map each of the one or more variances to one or more associated cube slices from among a plurality of cube slices in a data variance model, wherein each of the plurality of cube slices in the data variance model represents data maturity in terms of data, continuous approach, and measure; and
determine a data maturity score of the input data, based on the mapping of each of the one or more variances to one or more associated cube slices from among a plurality of cube slices, wherein the data maturity score represents a measured data maturity of the input data.
5. The system as claimed in claim 4, wherein the one or more hardware processors are configured to determine each of the one or more variances as being associated with one of the plurality of variance categories based on a pre-defined mapping of the one or more variances and the plurality of variance categories.
6. The system as claimed in claim 4, wherein the one or more hardware processors are configured to label the input data based on the measured data maturity of the input data.
Dated this 24th Day of February 2023
Tata Consultancy Services Limited
By their Agent & Attorney
(Adheesh Nargolkar)
of Khaitan & Co
Reg No IN-PA-1086
| # | Name | Date |
|---|---|---|
| 1 | 202321012790-STATEMENT OF UNDERTAKING (FORM 3) [24-02-2023(online)].pdf | 2023-02-24 |
| 2 | 202321012790-REQUEST FOR EXAMINATION (FORM-18) [24-02-2023(online)].pdf | 2023-02-24 |
| 3 | 202321012790-FORM 18 [24-02-2023(online)].pdf | 2023-02-24 |
| 4 | 202321012790-FORM 1 [24-02-2023(online)].pdf | 2023-02-24 |
| 5 | 202321012790-FIGURE OF ABSTRACT [24-02-2023(online)].pdf | 2023-02-24 |
| 6 | 202321012790-DRAWINGS [24-02-2023(online)].pdf | 2023-02-24 |
| 7 | 202321012790-DECLARATION OF INVENTORSHIP (FORM 5) [24-02-2023(online)].pdf | 2023-02-24 |
| 8 | 202321012790-COMPLETE SPECIFICATION [24-02-2023(online)].pdf | 2023-02-24 |
| 9 | 202321012790-Proof of Right [28-02-2023(online)].pdf | 2023-02-28 |
| 10 | 202321012790-FORM-26 [27-04-2023(online)].pdf | 2023-04-27 |
| 11 | Abstract1.jpg | 2023-11-16 |
| 12 | 202321012790-FER.pdf | 2025-08-14 |
| 1 | 202321012790_SearchStrategyNew_E_SearchHistoryE_22-05-2025.pdf | |