Abstract: A system (100) for providing a real-time vehicle exposition technology includes a vehicle (102), a vehicle interface module (104), and a smart device (112). The vehicle interface module (104) is configured to extract vehicular raw data. The smart device (112) includes a vehicle event algorithms (VEA) calibration data module (114), a vehicle catalog and voice messages module (116), a vehicle event algorithms module (118), a Bluetooth Low Energy (BLE)/Wifi communication interface module (120), and an operating system module (122). The vehicle event algorithms (VEA) calibration data module (114) is configured to calibrate the vehicle event algorithms (VEA) based on the extracted vehicular raw data. The vehicle catalog and voice messages module (116) is configured with a trained pre-recorded vehicle catalog and voice messages. The Bluetooth Low Energy (BLE)/Wifi communication interface module (120) is configured to interface the smart device (112) with the vehicle interface module (104).
Description:
FIELD OF DISCLOSURE
[0001] Embodiments as disclosed herein relate to vehicle exposition, and more particularly to providing a real-time vehicle exposition technology based on at least one real-time input provided by a user and at least one real-world atmospheric condition.
BACKGROUND
[0002] In today’s digital age, prospective buyers typically conduct thorough research before making a purchase, often exploring various online channels to evaluate products like vehicles. Despite the abundance of online information, the virtual experience may sometimes prove unsatisfactory, prompting buyers to seek personalized consultations at physical dealerships. Interestingly, technologically savvy individuals tend to place a premium on in-person interactions, particularly when it comes to buying vehicles. They desire tailored advice and comprehensive product insights that exceed what online sources provide, highlighting the significance of a personalized approach in today's consumer landscape.
[0003] A customized and fulfilling experience could significantly influence car sales closures, given the critical role of test-drives in the purchase of both new and used vehicles. However, the current test-drive experience lacks personalization and context, often neglecting essential aspects like operating conditions, vehicle customization, physical attributes, commute patterns, and past ownership experiences.
[0004] Although buyers attempt to compare new car features with their past experiences, they lack guidance for assessing personal fit and transparent knowledge about the vehicle. Additionally, existing approaches fail to provide performance metrics to auto manufacturers and dealerships, or to effectively engage buyers during the test-drive through interactive dialogue, missing opportunities to enhance the overall user experience.
[0005] Currently, the process of providing test-drive advice may involve human interaction with the buyers, which may be expensive; moreover, the human guide may omit explaining important features of the vehicle, which can adversely impact the customer's decision to purchase the vehicle.
[0006] Therefore, there exists a need for advancement in vehicle exposition technology to overcome the aforementioned problems.
SUMMARY
[0007] Various objects, features, aspects and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent like components.
[0008] In one aspect of the present embodiment, a system for providing a real-time vehicle exposition technology is disclosed. The system includes a vehicle interface module and a smart device. The vehicle interface module is configured to extract vehicular raw data. The smart device includes a vehicle event algorithms (VEA) calibration data module, a vehicle catalog and voice messages module, a vehicle event algorithms module, a Bluetooth Low Energy (BLE)/Wifi communication interface module and an operating system module. The vehicle event algorithms (VEA) calibration data module is configured to calibrate the vehicle event algorithms based on the extracted vehicular raw data. The vehicle catalog and voice messages module is configured with a trained pre-recorded vehicle catalog and voice messages. The Bluetooth Low Energy (BLE)/Wifi communication interface module is configured to interface the smart device with the vehicle interface module. The operating system module is configured to provide a software interface between the vehicle interface module and the smart device for a user. The vehicle event algorithms module performs loading one or more feature specific audio messages, loading a head up display page and connecting to the vehicle interface module, playing one or more messages on loading of the head up display page about vehicle performance while a user is driving or riding the vehicle, based on the real-time input provided by the user, and providing feedback and a report on the vehicle upon completion of the test drive.
[0009] In another aspect of the present embodiment, a method for providing a real-time vehicle exposition technology is disclosed. The method includes configuring a vehicle interface module to extract vehicular raw data. The method further includes interfacing the smart device with the vehicle interface module. The method further includes calibrating the vehicle event algorithms based on the extracted vehicular raw data. The method further includes training the smart device with a pre-recorded vehicle catalog and voice messages based on the extracted vehicular raw data. The method further includes loading, by the smart device, one or more feature specific audio messages. The method further includes loading, by the smart device, a head up display page and connecting to the vehicle interface module. The method further includes playing, by the smart device, one or more messages on loading of the head up display page about vehicle performance while a user is driving or riding the vehicle, based on the real-time input provided by the user. The method further includes providing, by the smart device, feedback and a report on the vehicle upon completion of the test drive.
BRIEF DESCRIPTION OF THE DRAWINGS
[00010] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[00011] FIG. 1 is an example block diagram for providing a real-time vehicle exposition technology, according to embodiments as disclosed herein;
[00012] FIG. 2 is an example block diagram for providing a Vehicle Event Algorithms (VEA), according to embodiments as disclosed herein;
[00013] FIG. 3 is an example diagram illustrating a method for providing a test ride to a driver based on the VEA module, according to embodiments as disclosed herein;
[00014] FIGs. 4A, 4B, 4C, 4D and 4E are example diagrams illustrating the method for providing the test ride to the driver based on the VEA module, according to embodiments as disclosed herein;
[00015] FIG. 5 is an example diagram illustrating the method for providing the test ride to the driver based on the VEA module, according to embodiments as disclosed herein;
[00016] FIG. 6 is an example flow chart showing a method for providing a real-time vehicle exposition technology, according to embodiments as disclosed herein; and
[00017] FIG. 7 is an example diagram illustrating the interaction of a VEA module with a smart device using a cloud through a communication network for storing the data and maintenance of the VEA, according to embodiments as disclosed herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00018] The following discussion provides many example embodiments of the inventive subject matter. Although each embodiment represents a single combination of inventive elements, the inventive subject matter is considered to include all possible combinations of the disclosed elements. Thus, if one embodiment comprises elements A, B, and C, and a second embodiment comprises elements B and D, then the inventive subject matter is also considered to include other remaining combinations of A, B, C, or D, even if not explicitly disclosed.
[00019] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00020] Many modifications will be apparent to those skilled in the art without departing from the scope of the present invention as hereinbefore described with reference to the accompanying drawings.
[00021] Where a definition or use of a term in an incorporated reference is inconsistent or contrary to the definition of that term provided herein, the definition of that term provided herein applies and the definition of that term in the reference does not apply.
[00022] Throughout this specification and the claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” and “comprising”, will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.
[00023] As used herein, the singular forms “a”, “an”, “the” include plural referents unless the context clearly dictates otherwise. Further, the terms “like”, “as such”, “for example”, “including” are meant to introduce examples which further clarify more general subject matter, and should be contemplated for the persons skilled in the art to understand the subject matter.
[00024] In some embodiments, the numbers expressing quantities of ingredients, properties such as concentration, reaction conditions, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified in some instances by the term “about.” Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that can vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the invention are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable. The numerical values presented in some embodiments of the invention may contain certain errors necessarily resulting from the standard deviation found in their respective testing measurements.
[00025] The recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g. “such as”) provided with respect to certain embodiments herein is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the invention.
[00026] FIG. 1 is a block diagram of the system for providing real-time vehicle exposition technology. A system 100 comprises a vehicle 102, a vehicle interface module 104, and a smart device 112. The vehicle interface module 104 is configured to extract vehicular raw data which may include, but is not limited to, engine Revolutions per Minute (RPM), Manifold absolute pressure (MAP), engine load, number of gears (if available, else calibrated), vehicle speed, fuel status (for gasoline vehicles), brake status (if available, else calibrated), motor RPM, throttle, battery voltage, regenerative braking status and so on.
[00027] The smart device 112 comprises a vehicle event algorithms (VEA) calibration data module 114, a vehicle catalog and voice messages module 116, a vehicle event algorithms module 118, a Bluetooth Low Energy (BLE)/Wifi communication interface module 120 and an operating system module 122. The vehicle event algorithms (VEA) calibration data module 114 is configured to calibrate the vehicle event algorithms (VEA) based on the extracted vehicular raw data. Processing of the extracted vehicular raw data can be configured based on the vehicle event algorithms (VEA) to describe the vehicle's features in real time based on the encountered event/scenario.
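By way of a non-limiting illustration, the raw channels listed above can be represented in software roughly as follows. This is a minimal sketch in Python; the field names and units are assumptions made for illustration only and are not prescribed by the disclosure.

```python
# Illustrative only: field names and units are assumptions, not part of the disclosure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleRawSample:
    """One time-stamped sample of the raw channels the vehicle interface module may expose."""
    timestamp_s: float
    vehicle_speed_kmph: float
    engine_rpm: Optional[float] = None             # ICE vehicles
    manifold_pressure_kpa: Optional[float] = None  # MAP
    engine_load_pct: Optional[float] = None
    gear_number: Optional[int] = None              # if available, else calibrated
    fuel_status: Optional[str] = None              # gasoline vehicles
    brake_status: Optional[bool] = None            # if available, else calibrated
    motor_rpm: Optional[float] = None              # EVs
    throttle_pct: Optional[float] = None
    battery_voltage_v: Optional[float] = None
    regen_braking_active: Optional[bool] = None
```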
[00028] Vehicle event algorithms (VEA) can be configured to use supervised Machine Learning (ML) on the vehicular raw data to recognize powertrain (engine/motor) status, which may include, but is not limited to, idle, running, acceleration, braking, gear-change and the like. The values are learnt during the development stage. The learnt values are ‘inserted’ into the specific VEA module 118 for detecting the real-time in-vehicle events which in turn trigger a relevant voice-based explanation of the feature. The feature elucidation is specific to the event. In other words, each event has its own algorithm and learnt values.
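A minimal sketch of the supervised training step described above is given below, assuming Python with scikit-learn. The feature set, the sample values and the choice of a decision-tree classifier are illustrative assumptions; the disclosure does not fix a particular model family.

```python
# Hypothetical sketch: learning powertrain states from labelled drive samples.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Each row: [engine_rpm, engine_load_pct, vehicle_speed_kmph, speed_change_kmph_per_s]
X_train = np.array([
    [850,  12,  0,  0.0],    # labelled idle
    [2200, 45, 40,  0.2],    # labelled running
    [3500, 80, 35,  6.0],    # labelled acceleration
    [2600, 10, 50, -5.5],    # labelled braking
])
y_train = ["idle", "running", "acceleration", "braking"]

clf = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)

# The learnt thresholds ("learnt values") are then embedded in the VEA module 118
# and evaluated against live samples during the test drive.
print(clf.predict([[900, 14, 0, 0.0]]))   # expected: ['idle']
```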
[00029] In an embodiment, during hard acceleration, the system may be configured to instantly recognize the accelerator pedal usage (based on the VEA module 118), which is a real-time in-vehicle event, and explain the vehicle’s performance prowess or the powertrain’s unique design. The VEA module 118 is configured to identify only aggressive pedal movement for triggering the explanation, and not just any input to the pedal. Further, the system may be configured to track which features have already been explained, so that a feature is not repeated, ensuring a good user experience.
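As a non-limiting sketch of the behaviour described in this embodiment, the following Python snippet flags only aggressive accelerator use and suppresses repetition of an already-explained feature. The threshold value and feature label are illustrative assumptions standing in for learnt calibration data.

```python
# Illustrative sketch; the threshold stands in for a learnt value, not a disclosed one.
HARD_ACCEL_THRESHOLD_KMPH_PER_S = 5.0
played_features: set[str] = set()

def on_speed_sample(speed_change_kmph_per_s: float, play_message) -> None:
    """Trigger the performance message only for aggressive pedal use, and only once per drive."""
    if speed_change_kmph_per_s < HARD_ACCEL_THRESHOLD_KMPH_PER_S:
        return                                   # ignore gentle accelerator input
    if "hard_acceleration" in played_features:
        return                                   # feature already explained; do not repeat
    played_features.add("hard_acceleration")
    play_message("hard_acceleration")            # e.g. powertrain performance explanation
```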
[00030] In another embodiment, using the on-board accelerometer and gyroscope of the smart device 112, the system can detect road undulation in real-time and can be configured to instantly explain the new suspension system of the vehicle. The pattern or absolute values generated when the vehicle encounters a road undulation or a bump are part of the supervised machine-learnt data which is fed into the VEA module 118.
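A minimal sketch of such undulation detection from the smart device's vertical acceleration is shown below; the window length and threshold are placeholders for the supervised, learnt pattern values.

```python
# Illustrative sketch: detecting a bump/undulation from the phone's vertical acceleration.
from collections import deque

class UndulationDetector:
    def __init__(self, threshold_mps2: float = 4.0, window: int = 20):
        self.threshold = threshold_mps2            # placeholder for a learnt value
        self.samples = deque(maxlen=window)

    def update(self, vertical_accel_mps2: float) -> bool:
        """Return True when the recent peak-to-peak swing resembles a bump-like event."""
        self.samples.append(vertical_accel_mps2)
        if len(self.samples) < self.samples.maxlen:
            return False
        return (max(self.samples) - min(self.samples)) > self.threshold

# On a True result, the application could play the suspension-system voice message.
```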
[00031] The vehicle catalog and voice messages module 116 is configured with a trained pre-recorded vehicle catalog and voice messages. The Bluetooth Low Energy (BLE)/Wifi communication interface module 120 is configured to interface the smart device 112 with the vehicle interface module 104. The operating system module 122 can be configured with operating systems which may include, but are not limited to, Android ®, Microsoft Windows ®, and iOS ® to provide an interface between the vehicle interface module 104 and the smart device 112 for a user. The vehicle event algorithms module 118 can be configured to load one or more feature specific audio messages and the head up display page, and to connect to the vehicle interface module. The vehicle event algorithms module 118 can play one or more messages, on loading of the head up display page, about vehicle performance while a user is driving or riding the vehicle. The vehicle event algorithms module 118 can provide feedback and a report on the vehicle upon completion of the test drive.
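The event-to-message dispatch implied by the above can be sketched as follows; the event labels and audio file names are hypothetical and are used only to show how the pre-recorded catalog might be keyed to detected events.

```python
# Hypothetical mapping from detected in-vehicle events to pre-recorded feature messages.
FEATURE_MESSAGES = {
    "welcome":           "audio/welcome.mp3",
    "hard_acceleration": "audio/performance.mp3",
    "road_undulation":   "audio/suspension.mp3",
    "third_gear_shift":  "audio/gearbox.mp3",
}

def handle_event(event: str, play_audio, played: set) -> None:
    """Play the feature-specific audio for an event, at most once per test drive."""
    clip = FEATURE_MESSAGES.get(event)
    if clip and event not in played:
        played.add(event)
        play_audio(clip)
```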
[00032] In another embodiment, the vehicle 102 can be a fuel vehicle or an electric vehicle. Further, the vehicle 102 can be a four-wheeler, a two-wheeler and the like.
[00033] In some embodiments, the vehicular raw data is OBD-II data, which is ‘public domain’ data. Vehicle manufacturers have made powertrain data available through the OBD port for troubleshooting, but the OBD-II data is basic, restricted and limited. As an example, the VEA for gear detection was created because no direct message carrying gear information is available from the OBD-II port. The system 100 uses OBD-II data to build complex algorithms/models for providing real-time information to the driver about the features of the vehicle based on inputs from the driver (such as a press of the brake, recognition of an event, or detection of a particular gear).
[00034] In some embodiments, the raw data includes engine RPM, MAP, engine load and gear number. For EVs, the raw data includes motor RPM, throttle, overall battery voltage, vehicle speed, brake status and regenerative braking status. The vehicle event algorithms module 118 includes a supervised learning module.
[00035] In an embodiment, the system 100 detects when the second and third gears are used by the user and plays one or more preloaded messages about the vehicle's performance while driving or riding in the second and third gear. Further, the system 100 plays a welcome message to the user. The system 100 detects if the user is driving fast and accordingly plays a preloaded message related to fast driving. The system 100 also detects overrun of the vehicle and plays a preloaded message related to the overrun.
[00036] In another embodiment, the system 100 can be used in internal combustion, electric or hybrid vehicles. Further, the system 100 is also applicable to bikes (all 2-wheelers) and cars (all 4-wheelers). The system 100 helps a vehicle manufacturer market their vehicle to a potential customer during a test-drive. For example, the system 100 acts as a voice-based guide that provides relevant explanations of the vehicle during the vehicle test drive.
[00037] Embodiments herein use the terms such as “driver”, “rider”, “user”, “customer”, and so on interchangeably to refer to a person riding/driving the vehicle to experience the test ride based on the VEA module.
[00038] FIG. 2 is a block diagram representing the building of vehicle event algorithm models and their deployment in the smart device. As shown in FIG. 2, the process comprises two steps: the training phase and deployment in the smart device. The training phase comprises loading data, pre-processing and supervised learning. The supervised learning phase comprises classification and regression. Data pre-processing is the first step in creating a machine learning model and involves preparing raw data for the model. The data pre-processing process includes cleaning, normalization, transformation, feature extraction, and selection. This is achieved by using a combination of cluster analysis, filters and summary statistics. In cluster analysis, data is classified into structures that are easier to understand and manipulate. It identifies commonalities in the data and reacts based on the presence or absence of such commonalities in each new piece of data. For example, data specific to an RPM band is classified together. Summary statistics are used to determine key information about the data in the complete sample, for example, the minimum value, maximum value, and mean value. After this, filters are used to select valuable information from a large data set. For example, the torque value at peak engine load is filtered.
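The pre-processing steps named above (cleaning, RPM-band grouping, summary statistics, filtering) might look roughly as follows. This is a sketch assuming Python with pandas; the column names and log file name are illustrative assumptions.

```python
# Illustrative pre-processing sketch; column names and the log file are assumptions.
import pandas as pd

df = pd.read_csv("drive_log.csv")                                  # hypothetical raw log
df = df.dropna(subset=["engine_rpm", "engine_load_pct"])           # cleaning

# Cluster-like grouping: bucket samples into 500-RPM bands so similar data sits together.
df["rpm_band"] = (df["engine_rpm"] // 500) * 500

# Summary statistics per band (min, max, mean) used later for calibration.
band_stats = df.groupby("rpm_band")["engine_load_pct"].agg(["min", "max", "mean"])

# Filtering: keep only samples near peak engine load, e.g. for torque calibration.
peak_load = df["engine_load_pct"].max()
peak_samples = df[df["engine_load_pct"] >= 0.95 * peak_load]
print(band_stats.head(), len(peak_samples))
```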
[00039] In the supervised learning phase, multiple methods are used based on the different models in the VEA. Classification finds functions that help divide the dataset into classes based on various parameters. When using classification, the model is trained on the training dataset and categorizes the data into various categories depending on the learnt data. Classification can predict the category the data belongs to. Using this method, the VEA models compute/predict the various engine states and full-load conditions. The model uses calibration data, such as the learnt idle engine load and idle engine RPM, to predict the idle engine state.
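As a minimal sketch, the idle-state prediction from the learnt idle engine load and idle engine RPM could be expressed as below; the calibration numbers and tolerances are placeholders, not disclosed values.

```python
# Placeholders standing in for learnt calibration values, not disclosed data.
IDLE_RPM_LEARNT = 850          # learnt idle engine rpm
IDLE_LOAD_LEARNT = 12.0        # learnt idle engine load (%)

def is_idle(engine_rpm: float, engine_load_pct: float,
            rpm_tol: float = 150.0, load_tol: float = 5.0) -> bool:
    """Classify the engine state as idle when rpm and load sit near the learnt idle values."""
    return (abs(engine_rpm - IDLE_RPM_LEARNT) <= rpm_tol and
            abs(engine_load_pct - IDLE_LOAD_LEARNT) <= load_tol)
```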
[00040] Regression is used to analyse the relationship between a dependent variable (target variable) and one or more independent variables (predictor variables). The objective is to determine the most suitable function that characterizes the connection between these variables. The VEA models use linear and polynomial regression methods for fuel flow calculation. Once the VEA models are built and calibrated, they are deployed in the smart device application. Real-time data is first pre-processed and then passed through the calibrated VEA models to identify events during the drive.
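A hedged sketch of a polynomial-regression fuel-flow model of the kind described above is given below, assuming Python with scikit-learn; the training points are illustrative dummy values only, not measured or disclosed calibration data.

```python
# Illustrative dummy values only; not measured data or disclosed calibration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Predictors: [engine_rpm, engine_load_pct]; target: fuel flow (litres/hour).
X = np.array([[900, 10], [1800, 30], [2500, 55], [3500, 80], [4500, 95]])
y = np.array([0.8, 2.1, 4.0, 7.5, 11.2])

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

# Once calibrated, the model is queried against pre-processed real-time data during the drive.
estimated_flow = model.predict(poly.transform([[3000, 70]]))
print(float(estimated_flow[0]))
```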
[00041] FIG. 3 is an example diagram illustrating a method for providing a test ride to a driver based on the VEA module. As illustrated in FIG. 3, the user desirous of performing the test ride logs in to the application and adds the vehicle. The smart device can be configured to load the VEA module for identifying the real-time in-vehicle event and fetch the calibrated data. Further, the system may be configured to load the feature-specific audio message based on the identified real-time in-vehicle event. The driver may then continue the test drive/riding process.
[00042] FIGs. 4A, 4B, 4C, 4D and 4E are example diagrams illustrating the method for providing the test ride to the driver based on the VEA module. As illustrated in FIG. 4A, the driver/customer, on boarding the vehicle, may provide the details to the smart device, and the system may play a “welcome message” when the ignition is turned on to start the test ride/drive. As illustrated in FIG. 4B, the system may load the head up display page and connect to the vehicle interface module. The system may play a message on loading of the head up display page, followed by the next feature message. Further, the system may detect the vehicle start and provide a corresponding message.
[00043] As illustrated in FIG. 4C, the system may detect the take-off and play a message. Further, the system may detect second gear and play a message. The system may further detect second gear acceleration and play a message. The system can detect the third gear shift on the rising edge and play a message.
[00044] As illustrated in FIG. 4D, the system can detect a fast drive and provide a corresponding message to the rider. On identifying a downshift on the falling edge, the system provides the corresponding message. The system can detect overrun and play the corresponding message. The system can detect cruising and provide a corresponding message. As illustrated in FIG. 4E, the system can detect braking and provide a message. The system can detect that the vehicle is back at the start location and provide a corresponding message. The system, on identifying the stop of the test ride/drive, can provide a corresponding message. Finally, the system can provide feedback and a report on the vehicle ride based on the VEA module.
[00045] FIG. 5 is an example diagram illustrating the method for providing the test ride to the driver based on the VEA module. As illustrated in FIG. 5, the system can be configured to identify various events occurring while the user experiences the test ride. The rider, on boarding the vehicle, may receive the “welcome message” on detecting the ignition on. The events identified while the rider experiences the test ride may include, but are not limited to, starting the vehicle, identifying a shift in gear, identifying gear acceleration with a hard pedal press, identifying a fast drive, detecting a downshift, detecting overrun, cruising, braking, and the like.
[00046] For example, brake detection through supervised machine learning is based on the rate of change of vehicle speed:
dV/dt > 0 indicates acceleration; dV/dt < 0 indicates deceleration,
wherein supervised machine learning establishes a coasting/engine-braking threshold (C), and
the braking VEA is triggered when the deceleration exceeds (C).
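A minimal sketch of this braking VEA in Python is shown below; the threshold value is a placeholder for the learnt coasting/engine-braking bound (C).

```python
# The threshold value is a placeholder for the learnt coasting/engine-braking bound (C).
COASTING_THRESHOLD_C = 3.0    # km/h per second

def braking_vea(speed_now_kmph: float, speed_prev_kmph: float, dt_s: float) -> bool:
    """Flag braking when deceleration exceeds the learnt coasting threshold (C)."""
    dv_dt = (speed_now_kmph - speed_prev_kmph) / dt_s    # > 0 acceleration, < 0 deceleration
    return dv_dt < 0 and abs(dv_dt) > COASTING_THRESHOLD_C
```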
[00047] For example, gear detection through supervised machine learning is based on the ratio
Vehicle Speed / Engine Speed = (R),
wherein supervised machine learning of (R) in each gear gives a ratio range unique to each gear, thereby enabling gear detection.
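A corresponding sketch of the gear-detection VEA is given below; the ratio ranges are placeholders for the per-gear ranges learnt during calibration.

```python
# Ratio ranges are placeholders for learnt per-gear values (vehicle speed km/h per engine rpm).
GEAR_RATIO_RANGES = {
    1: (0.004, 0.008),
    2: (0.008, 0.013),
    3: (0.013, 0.019),
    4: (0.019, 0.026),
}

def detect_gear(vehicle_speed_kmph: float, engine_rpm: float):
    """Return the gear whose learnt ratio range contains the current speed/RPM ratio (R)."""
    if engine_rpm <= 0:
        return None
    r = vehicle_speed_kmph / engine_rpm
    for gear, (lo, hi) in GEAR_RATIO_RANGES.items():
        if lo <= r < hi:
            return gear
    return None
```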
[00048] For example, fuel consumption is complex to analyse and is therefore modelled via the regression route. The calculated fuel flow is used for calculating engine torque (at the crankshaft). When a pre-taught torque value is encountered during the drive/ride, the VEA for engine performance is recognized and, accordingly, a real-time relevant message is relayed to the potential customer.
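One way this torque-recognition step could be sketched is shown below; the energy density, efficiency and pre-taught torque value are illustrative assumptions, not disclosed calibration data, and the simple power-to-torque conversion stands in for whatever calibrated relationship the VEA actually uses.

```python
# Constants are illustrative assumptions, not disclosed calibration values.
import math

FUEL_ENERGY_KJ_PER_L = 34_200     # approximate gasoline energy density
ASSUMED_EFFICIENCY = 0.30         # placeholder brake thermal efficiency
PRETAUGHT_TORQUE_NM = 110.0       # learnt value at which the performance message fires

def crank_torque_nm(fuel_flow_l_per_h: float, engine_rpm: float) -> float:
    """Estimate crankshaft torque from the calculated fuel flow and engine speed."""
    power_kw = (fuel_flow_l_per_h / 3600.0) * FUEL_ENERGY_KJ_PER_L * ASSUMED_EFFICIENCY
    omega_rad_per_s = 2.0 * math.pi * engine_rpm / 60.0
    return (power_kw * 1000.0) / omega_rad_per_s if omega_rad_per_s > 0 else 0.0

def engine_performance_event(fuel_flow_l_per_h: float, engine_rpm: float) -> bool:
    """Recognize the engine-performance VEA when the estimated torque reaches the pre-taught value."""
    return crank_torque_nm(fuel_flow_l_per_h, engine_rpm) >= PRETAUGHT_TORQUE_NM
```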
[00049] FIG. 6 is an example flow chart showing a method 700 for providing a real-time vehicle exposition technology. At step 702, the method 700 includes configuring a vehicle interface module 104 to extract vehicular raw data. At step 704, the method 700 includes interfacing the smart device 112 with the vehicle interface module 104. At step 706, the method 700 includes calibrating the smart device 112 with the extracted vehicular raw data. At step 708, the method 700 includes training the smart device 112 with a pre-recorded vehicle catalogue and voice messages based on the extracted vehicular raw data. At step 710, the method 700 includes loading, by the smart device 112, one or more feature specific audio messages. At step 712, the method 700 includes loading, by the smart device 112, a head up display page and connecting to the vehicle interface module. At step 714, the method 700 includes playing, by the smart device 112, one or more messages on loading of the head up display page about vehicle performance while a user is driving or riding the vehicle. At step 716, the method 700 includes providing, by the smart device 112, feedback and a report on the vehicle.
[00050] FIG. 7 is an example diagram illustrating the interaction of a VEA module with a smart device using a cloud through a communication network for storing the data and maintenance of the VEA, according to embodiments as disclosed herein.
[00051] While various examples of the present disclosure have been described above, it should be understood that they have been presented by way of example, and not limitation. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-described examples, but should be defined in accordance with the following claims and their equivalents.
Claims:
I/We claim:
1. A system (100) for providing a real-time vehicle exposition technology, the system (100) comprising:
a vehicle interface module (104) configured to extract vehicular raw data comprising at least one of engine Revolutions Per Minute (RPM), engine load, vehicle speed, brake status, motor RPM, throttle, battery voltage, and regenerative braking status;
a smart device (112) comprising:
a vehicle event algorithms (VEA) calibration data module (114) configured to calibrate the vehicle event algorithms (VEA) based on the extracted vehicular raw data,
a vehicle catalog and voice messages module (116) configured with trained pre-recorded vehicle catalog and voice messages,
a vehicle event algorithms (VEA) module (118) configured to insert the calibrated data, process and detect real-time in-vehicle events using supervised machine learning technique,
a Bluetooth Low Energy (BLE)/Wifi communication interface module (120) configured to interface the smart device (112) with the vehicle interface module (104), and
an operating system module (122).
2. The system as claimed in claim 1, wherein the vehicle interface module (104) is configured to extract additional vehicular raw data, comprising at least one of Manifold absolute pressure (MAP) data, number of gears, fuel status for vehicles, and engine temperature.
3. The system as claimed in claim 1, wherein the vehicle event algorithms (VEA) module (118) is configured to recognize powertrain states, comprising at least one of idle, running, acceleration, braking, and gear change based on the extracted vehicular raw data.
4. The system as claimed in claim 1, wherein the Bluetooth Low Energy (BLE)/Wifi communication interface module (120) is configured to establish a secure communication link between the smart device (112) and the vehicle interface module (104).
5. The system as claimed in claim 1, wherein the operating system module (122) is configured to provide a user-friendly interface for interacting with the vehicle exposition technology.
6. A method for providing a real-time vehicle exposition technology, the method comprising:
configuring a vehicle interface module (104) to extract vehicular raw data;
interfacing a smart device (112) with the vehicle interface module (104);
calibrating the vehicle event algorithms (VEA) based on the extracted vehicular raw data;
training, the smart device (112), with a pre-recorded vehicle catalog and voice messages based on the extracted vehicular raw data;
loading, by the smart device (112), one or more feature specific audio messages;
loading, by the smart device (112), a head up display page and connecting to the vehicle interface module (104);
playing, by the smart device (112), one or more messages on load of head up display page about vehicle performance while a user is driving or riding the vehicle, based on the real-time input provided by the user; and
providing, by the smart device (112), feedback and report on the vehicle upon completion of the test drive.
7. The method as claimed in claim 6, wherein the vehicle interface module (104) is configured to extract additional vehicular raw data, comprising at least one of Manifold absolute pressure (MAP) data, number of gears, fuel status for vehicles, and engine temperature.
8. The method as claimed in claim 6, wherein the vehicle event algorithms (VEA) module (118) is configured to recognize powertrain states, comprising at least one of idle, running, acceleration, braking, and gear change based on the extracted vehicular raw data.
9. The method as claimed in claim 6, wherein the Bluetooth Low Energy (BLE)/Wifi communication interface module (120) is configured to establish a secure communication link between the smart device (112) and the vehicle interface module (104).
10. The method as claimed in claim 6, wherein the operating system module (122) is configured to provide a user-friendly interface for interacting with the vehicle exposition technology.