
System And Method For Processing Roadway Data To Detect Road Deformities

Abstract: The present invention discloses a system and a method for processing roadway data to detect one or more road deformities. The system (102) comprises sensors (108) to generate the raw roadway data during roadway surveillance. A bespoke protocol is used to packetize the roadway data into structured packets for optimal transmission to a graphics processing unit (GPU)-based processing unit. The system (102) depacketizes the received structured packets by reconstructing the roadway data and storing it in one or more databases (104). A proprietary synchronization model aligns the reconstructed data with geospatial coordinates and timestamps, ensuring millisecond-level accuracy. The synchronized data is processed using one or more artificial intelligence (AI) or machine learning (ML) models trained on structured datasets to detect road deformities such as potholes and cracks. The system (102) ensures reliable data handling, real-time processing, and accurate detection, making it essential for road safety and infrastructure monitoring. Figure 2B


Patent Information

Application #:
Filing Date: 24 January 2024
Publication Number: 32/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email:
Parent Application:

Applicants

WINEURAL TECHNOLOGIES PRIVATE LIMITED
NO. C-634 KH NO-846, JVTS GARDEN, CHATTERPUR EXTN DELHI, SOUTH-WEST DELHI, DELHI- 110074, INDIA

Inventors

1. SHARAD KUMAR PANDEY
NO. C-634 KH NO-846 JVTS GARDEN, CHATTERPUR EXTN, SOUTH-WEST DELHI, DELHI- 110074, INDIA
2. SARTAJ SINGH SIDHU
NO. C-634 KH NO-846 JVTS GARDEN, CHATTERPUR EXTN, SOUTH-WEST DELHI, DELHI- 110074, INDIA

Specification

DESC: EARLIEST PRIORITY DATE: This Application claims priority from a Provisional patent application filed in India having Patent Application No. 202311079927, filed on 24th January 2024 and titled “SYSTEM AND METHOD FOR ANALYSING ROADWAYS”

FIELD OF INVENTION
Embodiments of the present invention relate to road monitoring and maintenance systems, and more particularly relate to a computer-implemented system and method for processing roadway data to detect one or more road deformities.

BACKGROUND
Roadway infrastructure forms a critical component of modern transportation systems. The condition of road surfaces directly impacts road safety, vehicle maintenance, and transportation efficiency. Monitoring and maintaining road conditions are essential for ensuring the safety and functionality of the roadway infrastructure. In recent years, there has been a growing emphasis on developing advanced systems for monitoring and maintaining the road infrastructure. The emphasis on developing advanced systems for monitoring and maintaining the road infrastructure is driven by a combination of technological progress, safety concerns, economic considerations, environmental awareness, data-driven approaches, and public expectations. These factors collectively contribute to a greater recognition of the importance of efficient and effective road monitoring and maintenance systems. The safety of road users is an overriding concern for governments and transportation authorities worldwide. As road networks expand and traffic volumes increase, the need for appropriate monitoring and maintenance systems becomes increasingly critical to ensuring road safety. However, the conventional road monitoring and maintenance systems rely on manual inspections, visual assessments, and rudimentary data collection methods. These methods have limitations, including subjectivity in assessment, delayed response to road anomalies, and insufficient data for predictive maintenance. In an existing technology, an analytics platform for identifying a roadway anomaly is disclosed. The analytics platform is configured to acquire anomaly data related to a roadway irregularity. The anomaly data is sourced from a sensor subsystem featuring a sensor array. The analytics platform is equipped to ascertain a plurality of filters linked to a classification of the roadway anomaly into a specific category. The plurality of filters is subsequently employed by the analytics platform to process the anomaly data associated with roadway irregularities. As a result of this application of the plurality of filters, the analytics platform is configured to recognize the roadway anomaly as belonging to the specific category. However, the ability of the analytics platform to identify the roadway anomaly is contingent on the predefined set of the plurality of filters associated with a particular anomaly type. If a new or previously unidentified type of anomaly arises, the analytics platform is incapable of classifying the unidentified type of anomaly appropriately. Furthermore, if the sensor subsystem malfunctions or provides inaccurate anomaly data, the result may be erroneous categorization of the roadway anomalies. There are various technical problems with road monitoring and maintenance in the prior art. In the existing technology, predictive maintenance is challenging to implement with the conventional road monitoring and maintenance systems.
The conventional road monitoring and maintenance systems lack the ability to predict road damage or deterioration trends accurately. Additionally, the conventional road monitoring and maintenance systems lack advanced data analysis capabilities to derive meaningful insights from the collected data, thereby hindering inspections and maintenance operations of the conventional road monitoring and maintenance systems. Therefore, there is a need for a system to address the aforementioned issues by providing an innovative approach to data collection, synchronization, and analysis, ensuring efficient and accurate assessment of road conditions, ultimately contributing to enhanced road safety and maintenance practices.

SUMMARY
This summary is provided to introduce a selection of concepts, in a simple manner, which are further described in the detailed description of the disclosure. This summary is neither intended to identify key or essential inventive concepts of the subject matter nor to determine the scope of the disclosure. In order to overcome the above deficiencies of the prior art, the present disclosure solves the technical problem by providing a computer-implemented system and method for processing roadway data to detect one or more road deformities. In accordance with an embodiment of the present disclosure, the computer-implemented method for processing the roadway data to detect the one or more road deformities is provided. In the first step, the computer-implemented method includes generating, by one or more sensors, raw roadway data associated with one or more roadways. Generating the raw roadway data comprises: a) providing, by one or more hardware processors through a time-stamping module, one or more timestamps to the raw roadway data based on an internal clock associated with each sensor of the one or more sensors operated at first defined frequencies, and b) providing, by the one or more hardware processors through the time-stamping module, the one or more timestamps to geospatial coordinates data generated from a space-based radio-navigation unit operated at second defined frequencies based on an elevated-frequency clock. In the next step, the computer-implemented method includes packetizing, by the one or more hardware processors through a data packetizing subsystem, the raw roadway data into one or more packets with a defined structure through a bespoke protocol for optimal transmission of the packetized roadway data to a graphics processing unit (GPU)-based processing unit. The raw roadway data packetized by the bespoke protocol comprises: a) metadata indicating sensor data identification, the one or more timestamps, the geospatial coordinates data, and packet sequence number, and b) one or more error-checking mechanisms comprising at least one of: a cyclic redundancy check (CRC) and a checksum mechanism for ensuring the integrity of the packetized data. The defined structure comprises at least one of: header data, payload data, and footer data. The header data comprises at least one of: sensor identification data, geospatial coordinates data, the one or more timestamps, packet sequence numbers, data type identifiers, and error-checking codes for data integrity verification. The payload data comprises at least one of: images, videos, sensor measurements, and geospatial data.
The footer data comprises at least one of: the one or more error-checking mechanisms and end-of-packet markers, for ensuring data integrity and detecting transmission completion. In the next step, the computer-implemented method includes depacketizing, by the one or more hardware processors through a data depacketizing subsystem, each packet of the one or more packets by reconstructing the roadway data based on the defined structure and storing it in one or more databases. The depacketizing comprises: a) validating, by the one or more hardware processors through a validation module, each received packet of the one or more packets through one or more error-checking mechanisms, and b) reconstructing, by the one or more hardware processors through a reconstructing module, the roadway data based on the metadata to reorder out-of-sequence packets within the one or more packets. In the next step, the computer-implemented method includes synchronising, by the one or more hardware processors through a synchronisation subsystem, the reconstructed roadway data from the one or more sensors associated with the one or more timestamps against geospatial coordinates data through a proprietary synchronisation model. The proprietary synchronisation model is configured to employ at least one of: a linear interpolation process, a clock drift compensation, and a multi-threaded processing technique for millisecond-level synchronisation of the roadway data. Synchronising through the proprietary synchronisation model comprises: a) aligning the one or more timestamps associated with the second defined frequencies with the one or more timestamps associated with the first defined frequencies based on at least one of: interpolating and multi-threading of each vision frame timestamp associated with the first defined frequencies using at least two adjoining geospatial coordinates, b) performing periodic recalibration of at least one of: the internal clock and the elevated-frequency clock to a master clock associated with the graphics processing unit (GPU)-based processing unit, based on a predefined drift tolerance threshold, and c) performing a dual-layer synchronisation between the raw roadway data and the geospatial coordinates data through: a hardware-based initial synchronisation employing global positioning system (GPS) signals with microsecond-level synchronisation, and an instruction-based secondary synchronisation employing the linear interpolation process and the clock drift compensation. In the next step, the computer-implemented method includes processing, by the one or more hardware processors through a data processing subsystem, the synchronised roadway data through at least one of: one or more artificial intelligence (AI) models and one or more machine learning (ML) models trained on one or more structured datasets to detect the one or more road deformities. At least one of the one or more AI models and the one or more ML models includes deep learning models such as a Single Shot Multibox Detector (SSD) framework. The SSD framework uses a MobileNetV2 backbone trained on the one or more structured datasets. The one or more structured datasets comprise regional roadways data with one or more characteristics and one or more classifications comprising at least one of: varying road deformities, environmental challenges, and construction materials.
In the next step, the computer-implemented method includes a) storing the synchronised roadway data and data associated with the detected one or more road deformities in the one or more databases, b) post-processing the stored roadway data through a post-processing technique comprising non-maximum suppression (NMS) to refine one or more bounding boxes to detect the one or more road deformities, and c) generating, by a data analytics engine configured with a graphical user interface unit, at least one of: an analysis output data, an inspection dataset, detection summary data, diagnostic reports, roadway condition profiles, and visual representation data, based on the post-processed roadway data. In accordance with another embodiment of the present disclosure, the computer-implemented system for processing the roadway data to detect the one or more road deformities is provided. The computer-implemented system comprises one or more sensors, one or more hardware processors, and a memory unit. The memory unit is coupled to the one or more hardware processors and comprises a plurality of subsystems in the form of programmable instructions executable by the one or more hardware processors. The plurality of subsystems comprises the data packetizing subsystem, the data depacketizing subsystem, the synchronisation subsystem, and the data processing subsystem. In an embodiment, the data packetizing subsystem is configured to packetize the raw roadway data into the one or more packets with the defined structure through the bespoke protocol for optimal transmission of the packetized roadway data to the GPU-based processing unit. The data depacketizing subsystem is configured to depacketize each packet of the one or more packets by reconstructing the roadway data based on the defined structure and storing it in the one or more databases. The synchronisation subsystem is configured to synchronise the reconstructed roadway data from the one or more sensors associated with the one or more timestamps against geospatial coordinates data through the proprietary synchronisation model. The proprietary synchronisation model is configured to employ at least one of: the linear interpolation process, the clock drift compensation, and the multi-threaded processing technique for millisecond-level synchronisation of the roadway data. Further, the data processing subsystem is configured to process the synchronised roadway data through at least one of: one or more AI models and one or more ML models trained on the one or more structured datasets for detecting the one or more road deformities. At least one of the one or more AI models and the one or more ML models includes deep learning models such as a Single Shot Multibox Detector (SSD) framework. To further clarify the advantages and features of the present invention, a more particular description of the invention will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the invention and are therefore not to be considered limiting in scope. The invention will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
Figure 1 illustrates an exemplary block diagram representation of a network architecture depicting a computer-implemented system for processing roadway data to detect one or more road deformities, in accordance with an embodiment of the present disclosure;
Figure 2A illustrates an exemplary block diagram representation of the computer-implemented system as shown in Figure 1 for processing the roadway data, in accordance with an embodiment of the present invention;
Figure 2B illustrates an exemplary flow diagram representation of the computer-implemented system, in accordance with an embodiment of the present invention;
Figure 3 illustrates an exemplary flowchart depicting a packetizing process and a depacketizing process of the roadway data associated with the computer-implemented system, in accordance with an embodiment of the present disclosure; and
Figure 4 illustrates an exemplary flowchart of a computer-implemented method for processing roadway data to detect one or more road deformities, in accordance with an embodiment of the present disclosure.
Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, the method steps, chemical compounds, equipment, and parameters used herein may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION OF THE PRESENT INVENTION
For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures, and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art, are to be construed as being within the scope of the present disclosure. The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more components, compounds, and ingredients preceded by "comprises... a" does not, without more constraints, preclude the existence of other components, compounds, ingredients, or additional components. Appearances of the phrases "in an embodiment", "in another embodiment", and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs.
The system, methods, and examples provided herein are only illustrative and not intended to be limiting. In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise. Embodiments of the present disclosure relate to a computer-implemented system and method for processing roadway data to detect one or more road deformities. Figure 1 illustrates an exemplary block diagram representation of a network architecture 100 depicting a computer-implemented system 102 for processing roadway data to detect one or more road deformities, in accordance with an embodiment of the present disclosure. According to an exemplary embodiment of the present disclosure, Figure 1 depicts that the network architecture 100 may include the computer-implemented system 102, one or more databases 104, one or more communication devices 106, and one or more sensors 108. The computer-implemented system 102, the one or more databases 104, and the one or more communication devices 106 may be communicatively coupled via one or more communication networks 116, ensuring seamless data transmission, processing, and decision-making. The computer-implemented system 102 acts as a central processing unit within the network architecture 100, responsible for processing roadway data to detect one or more road deformities. The computer-implemented system 102 is configured to execute a set of computer-readable instructions that control a plurality of subsystems 114. In an exemplary embodiment, the computer-implemented system 102 comprises one or more hardware processors 110 and a memory unit 112. The memory unit 112 is operatively connected to the one or more hardware processors 110. The memory unit 112 comprises a set of computer-readable instructions in the form of the plurality of subsystems 114, configured to be executed by the one or more hardware processors 110. In an exemplary embodiment, the computer-implemented system 102 comprises one or more servers 118. The one or more servers 118 may comprise a combination of discrete components, an integrated circuit, an application-specific integrated circuit, a field-programmable gate array, a digital signal processor, or other suitable hardware. The “software” may comprise one or more objects, agents, threads, lines of code, subroutines, separate software applications, two or more lines of code, or other suitable software structures operating in one or more software applications or the one or more hardware processors 110. In an exemplary embodiment, the one or more hardware processors 110 may include, for example, microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuits, and/or any devices that manipulate data or signals based on operational instructions. Among other capabilities, the one or more hardware processors 110 may fetch and execute computer-readable instructions in the memory unit 112 operationally coupled with the computer-implemented system 102 for performing tasks such as data processing, input/output processing, and/or any other functions. Any reference to a task in the present disclosure may refer to an operation being performed, or that may be performed, on raw roadway data. The one or more hardware processors 110 are high-performance processors capable of handling large volumes of the raw roadway data and complex computations.
The one or more hardware processors 110 may be, but not limited to, at least one of: multi-core central processing units (CPU), a graphics processing unit (GPU)-based processing unit 214, and the like that enhance an ability of the computer-implemented system 102 to process the real-time raw roadway data from one or more sensors 108 simultaneously. In an exemplary embodiment, the one or more databases 104 may configured to store and manage data related to various aspects of the computer-implemented system 102. The one or more databases 104 may store at least one of, but not limited to, the raw roadway data, packetized data, synchronized data, processed data, one or more structured datasets, error logs and validation data, historical maintenance data, at least one of: an analysis output data, an inspection dataset, detection summary data, diagnostic reports, roadway condition profiles, and visual representation data, configuration details of the computer-implemented system 102, and the like. The one or more databases 104 serve as a centralized repository for critical data elements that are integral to the secure operation of the computer-implemented system 102, enabling efficient management and synchronization of data associated with the computer-implemented system 102. The one or more databases 104 enable the computer-implemented system 102 to dynamically retrieve, analyse, and update the stored raw roadway data in real-time, for processing the roadway data to detect the one or more road deformities. The one or more databases 104 may include different types of databases such as, but not limited to, relational databases (e.g., Structured Query Language (SQL) databases), non-Structured Query Language (NoSQL) databases (e.g., MongoDB, Cassandra), time-series databases (e.g., InfluxDB), an OpenSearch database, object storage systems, and the like. In an exemplary embodiment, the one or more communication devices 106 are configured to enable one or more users to interact with the computer-implemented system 102. The one or more communication devices 106 may be digital devices, computing devices, and/or networks. The one or more communication devices 106 may include, but not limited to, a mobile device, a smartphone, a personal digital assistant (PDA), a tablet computer, a phablet computer, a wearable computing device, a virtual reality/augmented reality (VR/AR) device, a laptop, a desktop, and the like. The one or more communication devices 106 are configured with a user interface configured to enable seamless interaction between the one or more users and the computer-implemented system 102. The user interface may include the graphical user interfaces (GUIs) units, voice-based interfaces, and touch-based interfaces, depending on the capabilities of the computer-implemented system 102 being used. The GUIs units may be configured to display outputs, including at least one of: the analysis output data, the inspection dataset, the detection summary data, the diagnostic reports, the roadway condition profiles, and the visual representation data. 
In an exemplary embodiment, the one or more communication networks 116 may be, but not limited to, a wired communication network and/or a wireless communication network, a local area network (LAN), a wide area network (WAN), a Wireless Local Area Network (WLAN), a metropolitan area network (MAN), a telephone network, such as the Public Switched Telephone Network (PSTN) or a cellular network, an intranet, the Internet, a fibre optic network, a satellite network, a cloud computing network, a combination of networks, and the like. The wired communication network may comprise, but not limited to, at least one of: Ethernet connections, Fiber Optics, Power Line Communications (PLCs), Serial Communications, Coaxial Cables, Quantum Communication, Advanced Fiber Optics, Hybrid Networks, and the like. The wireless communication network may comprise, but not limited to, at least one of: wireless fidelity (wi-fi), cellular networks (including fourth generation (4G) technologies and fifth generation (5G) technologies), Bluetooth®, ZigBee®, long-range wide area network (LoRaWAN), satellite communication, radio frequency identification (RFID), 6G (sixth generation) networks, advanced IoT protocols, mesh networks, non-terrestrial networks (NTNs), near field communication (NFC), and the like. In an exemplary embodiment, the one or more sensors 108 are operatively connected to a surveillance vehicle. The surveillance vehicle may include, but not limited to, a four-wheeler automobile, a two-wheeler automobile, an unmanned aerial vehicle (UAV), and the like. The one or more sensors 108 comprise, but are not constrained to, at least one of: one or more complementary metal oxide semiconductor (CMOS) sensors, one or more depth sensors, one or more light detection and ranging (LiDAR) sensors, one or more inertial measurement unit (IMU) sensors, one or more ultrasonic sensors, one or more radio detection and ranging (Radar) sensors, one or more gyroscope sensors, one or more accelerometer sensors, one or more sound navigation and ranging (Sonar) sensors, ground-penetrating radar (GPR), one or more red-green-blue (RGB) sensors, one or more cameras, and the like. In an exemplary embodiment, the one or more CMOS sensors are configured to capture high-resolution images and videos with low power consumption, and are commonly used in the one or more cameras. The one or more depth sensors are configured to measure the distance between objects and the one or more depth sensors to create depth maps for at least one of: three-dimensional (3D) modelling and obstacle detection. The one or more LiDAR sensors use laser beams to measure distances and generate highly accurate 3D maps of the surroundings. The one or more IMU sensors combine accelerometers and gyroscopes to measure motion, orientation, and velocity. The one or more ultrasonic sensors emit high-frequency sound waves to detect object distances by measuring the reflected sound. The one or more Radar sensors are configured to use radio waves to detect the range, velocity, and angle of objects in the roadways. The one or more gyroscope sensors are configured to measure rotational motion and orientation changes around an axis for stability and navigation. The one or more accelerometer sensors are configured to detect changes in motion or acceleration along one or more axes. The one or more Sonar sensors are configured to use sound waves to detect underwater or air-based objects and their distances.
The one or more GPR sensors are configured to emit electromagnetic waves to detect subsurface anomalies like cracks or voids beneath the road surface. The one or more RGB sensors are configured to capture colour data in the visible spectrum for high-quality imaging. The one or more cameras are configured to provide visual data in the form of at least one of: images and videos for analysis and monitoring of the roadways. In an exemplary embodiment, the one or more sensors 108 are powered independently through the software application interfacing with the GPU-based processing unit 214. For instance, universal serial bus (USB)-based sensors, such as the one or more GPS sensors, derive their power from the USB interface of the central GPU-based processing unit 214. Upon powering on, the computer-implemented system 102 running on the central unit initializes and manages the roadway data acquisition from each sensor 108 of the one or more sensors 108, either through a polling mechanism or by issuing a signal and awaiting an event. The initialization and data-fetching mechanisms vary depending on the type of sensor 108 of the one or more sensors 108. The computer-implemented system 102 initiates the one or more GPS sensors by signalling an “initialize” event. The computer-implemented system 102 is configured to poll the one or more GPS sensors for geo-coordinates until valid (non-zero) coordinates are received. Once valid coordinates are obtained, the one or more GPS sensors are marked as “initialized.” Subsequently, the computer-implemented system 102 begins fetching the roadway data, including the one or more timestamps and geospatial coordinates data, from the one or more GPS sensors. However, data acquisition starts only upon receiving a “start” command from the graphical user interface unit. The computer-implemented system 102 creates distinct objects for each type of vision sensor. During initialization, the software application opens the necessary ports for communication with these sensors, depending on their interface type. For USB-based vision sensors, the application establishes channels to receive frames. For other sensors of the one or more sensors 108 using Ethernet interfaces, transport protocol ports are opened. Following this, the computer-implemented system 102 configures the one or more vision sensors with parameters such as frame rate, sensitivity, frame format, resolution, and dimensions. Once configured, the one or more sensors 108 transmit data to the computer-implemented system 102 as events. While the frames are continuously received, the computer-implemented system 102 begins writing the roadway data to the one or more databases 104 and transmitting it to the one or more servers 118 in real-time only upon receiving a “start” command from the graphical user interface unit. The initialization of the one or more IMU sensors is carried out via a simple power-on sequence orchestrated by the computer-implemented system 102. The roadway data from the one or more IMU sensors is then synchronized with the frames received from the one or more vision sensors. However, the roadway data from the one or more IMU sensors is non-linear in frequency, as it is primarily generated during events such as changes in velocity or rotation (e.g., detection of potholes or speed bumps). The computer-implemented system 102 initializes the one or more sonar sensors by polling initial data to validate proper initialization.
Upon receiving the valid roadway data, the sonar sensors are marked as “initialized.” The roadway data acquisition from the one or more sonar sensors operates in polling mode and begins only when the computer-implemented system 102 receives a “start” command from the graphical user interface unit. Once powered on, the one or more GPR sensors continuously transmit data in the form of multiple events. The computer-implemented system 102 begins writing this roadway data to the one or more databases 104 and transmitting it to the one or more servers 118 in real-time only after receiving a “start” command from the graphical user interface unit. In an exemplary embodiment, the time-stamping module 108a is operatively connected to the one or more sensors 108. The time-stamping module 108a is configured to obtain the raw sensor data from the one or more sensors 108. The time-stamping module 108a is configured to provide the one or more timestamps to the raw roadway data based on an internal clock associated with each sensor of the one or more sensors 108 operated at first defined frequencies. In an exemplary embodiment, the first defined frequencies range between 25 Hertz (Hz) and 30 Hz. The time-stamping module 108a is configured to provide the one or more timestamps to the geospatial coordinates data generated from a space-based radio-navigation unit operated at second defined frequencies based on an elevated-frequency clock. In an exemplary embodiment, the second defined frequencies are configured to operate within a range of 1 kilohertz (kHz) to 10 kHz, providing high precision and accuracy in geospatial data acquisition. The time-stamping module 108a is configured to ensure synchronization between the raw roadway data and the geospatial coordinates data by aligning the time references from the internal clock and the elevated-frequency clock. The time-stamping module 108a leverages advanced synchronization techniques to eliminate discrepancies caused by clock drift, time lags, or variations in data acquisition rates. By doing so, the time-stamping module 108a ensures that each data point, whether originating from the one or more sensors 108 or the space-based radio-navigation unit, may be accurately correlated to its corresponding geospatial location and temporal instance. The space-based radio-navigation unit may comprise, but not limited to, one of: a Global Positioning System (GPS), Global Navigation Satellite System (GLONASS), Galileo, Apple® Maps, Here WeGo, NavIC (Navigation with Indian Constellation), Inertial Navigation Systems (INS), and the like. The space-based radio-navigation unit is operatively linked to the GPU-based processing unit 214 through a peripheral mechanism. In an exemplary embodiment, the computer-implemented system 102 may be implemented by way of a single device or a combination of multiple devices that may be operatively connected or networked together. The computer-implemented system 102 may be implemented in hardware or a suitable combination of hardware and software. Though few components and the plurality of subsystems 114 are disclosed in Figure 1, there may be additional components and subsystems which are not shown, such as, but not limited to, ports, routers, repeaters, firewall devices, network devices, the one or more databases 104, network attached storage devices, assets, machinery, instruments, facility equipment, emergency management devices, image capturing devices, any other devices, and combinations thereof.
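Referring back to the time-stamping module 108a described above, the following is a minimal, non-authoritative sketch of how per-sensor readings (first defined frequencies) and geospatial fixes (elevated-frequency clock) might be stamped on a common time base. The class name, field names, and the use of Python's monotonic clocks are assumptions made purely for illustration, not the system's actual implementation.

```python
# Illustrative sketch of the time-stamping step (assumed names and clocks).
import time
from dataclasses import dataclass

@dataclass
class StampedReading:
    sensor_id: str    # which sensor produced the reading
    timestamp: float  # seconds, from the clock that stamped it
    payload: object   # frame, IMU sample, GPS fix, etc.

class TimeStampingModule:
    """Attaches timestamps to raw sensor readings and geospatial fixes."""

    def __init__(self, sensor_clock=time.monotonic, gps_clock=time.monotonic_ns):
        # sensor_clock stands in for the per-sensor internal clock (25-30 Hz data);
        # gps_clock stands in for the elevated-frequency clock used for GPS stamping.
        self.sensor_clock = sensor_clock
        self.gps_clock = gps_clock

    def stamp_sensor(self, sensor_id: str, payload) -> StampedReading:
        return StampedReading(sensor_id, self.sensor_clock(), payload)

    def stamp_gps(self, fix) -> StampedReading:
        # Nanosecond clock scaled to seconds so both streams share one time base.
        return StampedReading("gps", self.gps_clock() / 1e9, fix)

# Usage: stamper = TimeStampingModule(); reading = stamper.stamp_sensor("cam0", frame)
```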
The components and subsystems shown in Figure 1 should not be construed as limiting by a person skilled in the art. Although Figure 1 illustrates the computer-implemented system 102 and the one or more communication devices 106 connected to the one or more databases 104, one skilled in the art may envision that the computer-implemented system 102 and the one or more communication devices 106 may be connected to several user devices located at various locations and to several databases via the one or more communication networks 116. Those of ordinary skill in the art will appreciate that the hardware depicted in Figure 1 may vary for particular implementations. For example, other peripheral devices such as an optical disk drive and the like, the local area network (LAN), the wide area network (WAN), a wireless (e.g., wireless-fidelity (Wi-Fi)) adapter, a graphics adapter, a disk controller, or an input/output (I/O) adapter may also be used in addition to or in place of the hardware depicted. The depicted example is provided for explanation only and is not meant to imply architectural limitations concerning the present disclosure. Those skilled in the art will recognize that, for simplicity and clarity, the full structure and operation of all data processing systems suitable for use with the present disclosure are not being depicted or described herein. Instead, only so much of the computer-implemented system 102 as is unique to the present disclosure or necessary for an understanding of the present disclosure is depicted and described. The remainder of the construction and operation of the computer-implemented system 102 may conform to any of the various current implementations and practices known in the art. Figure 2A illustrates an exemplary block diagram 200A representation of the computer-implemented system 102 as shown in Figure 1 for processing the roadway data, in accordance with an embodiment of the present invention; Figure 2B illustrates an exemplary flow diagram 200B representation of the computer-implemented system 102, in accordance with an embodiment of the present invention. In an exemplary embodiment, the computer-implemented system 102 (hereinafter referred to as the system 102) comprises the one or more servers 118, the memory unit 112, and a storage unit 204. The one or more hardware processors 110, the memory unit 112, and the storage unit 204 are communicatively coupled through a system bus 202 or any similar mechanism. The system bus 202 functions as the central conduit for data transfer and communication between the one or more hardware processors 110, the memory unit 112, and the storage unit 204. The system bus 202 facilitates the efficient exchange of information and instructions, enabling the coordinated operation of the system 102. The system bus 202 may be implemented using various technologies, including but not limited to, parallel buses, serial buses, and high-speed data transfer interfaces such as, but not limited to, at least one of a: universal serial bus (USB), peripheral component interconnect express (PCIe), and similar standards. In an exemplary embodiment, the memory unit 112 is operatively connected to the one or more hardware processors 110. The memory unit 112 comprises the plurality of subsystems 114 in the form of programmable instructions executable by the one or more hardware processors 110.
The plurality of subsystems 114 comprises a data packetizing subsystem 206 and a data depacketizing subsystem 208, a synchronisation subsystem 210, and a data processing subsystem 212. The one or more hardware processors 110, as used herein, means any type of computational circuit, such as, but not limited to, the microprocessor unit, microcontroller, complex instruction set computing microprocessor unit, reduced instruction set computing microprocessor unit, very long instruction word microprocessor unit, explicitly parallel instruction computing microprocessor unit, graphics processing unit, digital signal processing unit, or any other type of processing circuit. The one or more hardware processors 110 may also include embedded controllers, such as generic or programmable logic devices or arrays, application-specific integrated circuits, single-chip computers, and the like. The memory unit 112 may be the non-transitory volatile memory and the non-volatile memory. The memory unit 112 may be coupled to communicate with the one or more hardware processors 110, such as being a computer-readable storage medium. The one or more hardware processors 110 may execute machine-readable instructions and/or source code stored in the memory unit 112. A variety of machine-readable instructions may be stored in and accessed from the memory unit 112. The memory unit 112 may include any suitable elements for storing data and machine-readable instructions, such as read-only memory, random access memory, erasable programmable read-only memory, electrically erasable programmable read-only memory, a hard drive, a removable media drive for handling compact disks, digital video disks, diskettes, magnetic tape cartridges, memory cards, and the like. In the present embodiment, the memory unit 112 includes the plurality of subsystems 114 stored in the form of machine-readable instructions on any of the above-mentioned storage media and may be in communication with and executed by the one or more hardware processors 110. The storage unit 204 may be a cloud storage or the one or more databases 104 such as those shown in Figure 1. The storage unit 204 may store, but not limited to, recommended course of action sequences dynamically generated by the system 102. The action sequences comprise detailed instructions, insights, and prioritized maintenance tasks derived from the processed roadway data. For example, the action sequences may include recommendations for immediate repair of critical one or more road deformities, such as potholes or cracks, scheduling of periodic maintenance activities, alerts for potential structural risks based on predictive analysis, and optimization of resource allocation for maintenance crews. Additionally, the storage unit 204 may retain previous action sequences for comparison and future reference, enabling continuous refinement of the system 102 over time. The storage unit 204 may be any kind of database such as, but not limited to, relational databases, dedicated databases, dynamic databases, monetized databases, scalable databases, cloud databases, distributed databases, any other databases, and a combination thereof. Furthermore, the storage unit 204 may integrate with external systems, such as transportation management systems, city planning tools, and contractor platforms, to automate workflows. 
For example, recommended course of action sequences stored in the storage unit 204 may be exported as task orders to maintenance crews or integrated into urban development plans for better resource optimization. The storage unit 204 also serves as a repository for historical roadway data, machine learning model parameters, calibration data for the one or more sensors 108, and geospatial maps. This allows the system 102 to perform longitudinal analysis, enabling it to identify recurring patterns in roadway damage or wear and tear. For instance, the system 102 may generate action sequences that predict specific areas of the roadway prone to deterioration based on historical data trends, environmental conditions, and traffic patterns. In an exemplary embodiment, the data packetizing subsystem 206 is configured to process raw roadway data obtained from the one or more sensors 108 and organize the roadway data into one or more packets with a defined structure through a bespoke protocol. The bespoke protocol is configured to ensure the optimal and efficient transmission of the packetized roadway data to the GPU-based processing unit 214 for further analysis and processing. The packetizing of the raw roadway data by the bespoke protocol involves incorporating essential metadata and applying robust error-checking mechanisms to ensure the roadway data integrity and reliability during transmission. The packetizing process of the raw roadway data includes, embedding the metadata into the one or more packets. This metadata serves multiple purposes, such as identifying the source and type of data. The metadata comprises at least one of, but not limited to, sensor data identification, the one or more timestamps, the geospatial coordinates data, packet sequence number, and the like. The sensor data identification may be unique identifiers for each sensor 108 of the one or more sensors 108 contributing the roadway data, enabling accurate source tracking. The one or more timestamps are accurate time markers generated by the time-stamping module 108a to synchronize data from the one or more sensors 108. The geospatial coordinates data comprises location data provided by the space-based radio-navigation unit. The packet sequence number is a unique identifier for each packet, ensuring proper sequencing and facilitating the detection of lost or out-of-order packets during transmission. The one or more error-checking mechanisms comprise at least one of: cyclic redundancy check (CRC) and a checksum mechanism for ensuring the integrity of the packetized data. The CRC is a mathematical model that generates a short, fixed-length binary sequence derived from packet contents to detect any errors introduced during data transmission. The checksum mechanism is a simpler error-detection method that sums the numerical values of the packet's contents to identify potential alterations. The bespoke protocol organizes each packet of the one or more packets into three primary sections to facilitate seamless transmission and error-checking. 1. Header data: the header serves as the introductory portion of each packet, containing metadata and identifiers essential for data interpretation and transmission. The header data comprises at least one of: sensor identification data, geospatial coordinates data, the one or more timestamps, packet sequence numbers, data type identifiers, and error-checking codes for data integrity verification. 2. 
Payload data: the payload contains the core data being transmitted, derived from the one or more sensors 108. The payload data may include, but not limited to, at least one of: images, videos, sensor measurements, geospatial data, and the like. 3. Footer data: the footer marks the conclusion of each packet of the one or more packets and contains mechanisms for verifying packet integrity and signalling the end of transmission. The incorporation of the one or more error-checking mechanisms within both the header data and the footer data enhances the robustness of the transmission process. These one or more error-checking mechanisms ensure that any corruption or loss of data during transmission is promptly detected, enabling error recovery and retransmission of corrupted packets within the one or more packets as necessary. This feature is particularly critical in real-time roadway monitoring, where the accuracy and completeness of the roadway data are paramount. In this manner, the data packetizing subsystem 206 enables the seamless transmission of structured, error-checked roadway data to the GPU-based processing unit 214. This facilitates efficient data analysis, ensuring the integrity and reliability of the system 102's operation in various environmental conditions and use cases. The defined packet structure and bespoke protocol collectively enhance the capability of the system 102 to handle large volumes of the roadway data in a robust, efficient, and scalable manner. In an exemplary embodiment, the data depacketizing subsystem 208 is configured to process each received packet from the one or more packets by reconstructing the raw roadway data into its original form based on the defined structure utilized during packetization. The reconstructed roadway data is then stored in the one or more databases 104 for further analysis, processing, or retrieval by other subsystems or external interfaces. The data depacketizing subsystem 208 ensures that the integrity of the roadway data is maintained throughout the process and that any disruptions during transmission are effectively handled. The data depacketizing subsystem 208 begins by validating each received packet through a validation module 208a, which operates using the one or more error-checking mechanisms embedded within each packet of the defined structure. This ensures that only uncorrupted and complete packets within the one or more packets are considered for reconstruction. The validation process comprises: a) the CRC value in the received packet is recalculated and compared with the CRC value in the packet header/footer; any mismatch indicates data corruption, and the packet is flagged for retransmission or error logging, and b) the checksum embedded in each packet is verified by summing up the contents of the received packet; if the calculated checksum does not match the embedded value, the packet is considered invalid. The validation module 208a ensures that the metadata are intact and unaltered. The footer is checked for the presence of valid markers that indicate transmission completion. The invalid packets are logged and, if necessary, a request for retransmission is generated to recover missing or corrupted data. This process ensures that no incomplete or erroneous data enters the reconstruction stage.
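As a rough illustration of the header/payload/footer structure and the CRC validation described above, and not a description of the actual bespoke protocol, whose exact field layout is not disclosed here, the following sketch builds a packet with a metadata header, a payload, and a CRC32 footer, then checks it on receipt. The field names, sizes, and use of JSON for the header are assumptions.

```python
# Hedged sketch of a header/payload/footer packet with CRC32 checking.
# All field names, sizes, and the JSON header encoding are assumptions.
import json
import struct
import zlib

def build_packet(sensor_id: str, timestamp: float, lat: float, lon: float,
                 seq: int, payload: bytes) -> bytes:
    header = json.dumps({
        "sensor_id": sensor_id,    # source tracking
        "timestamp": timestamp,    # from the time-stamping module
        "geo": [lat, lon],         # geospatial coordinates data
        "seq": seq,                # packet sequence number
        "payload_len": len(payload),
    }).encode()
    body = struct.pack(">I", len(header)) + header + payload
    footer = struct.pack(">I", zlib.crc32(body))  # CRC over header + payload
    return body + footer

def validate_packet(packet: bytes):
    """Return (header_dict, payload) if the CRC matches, else None."""
    body, (crc,) = packet[:-4], struct.unpack(">I", packet[-4:])
    if zlib.crc32(body) != crc:
        return None                    # corrupted: flag for retransmission
    (header_len,) = struct.unpack(">I", body[:4])
    header = json.loads(body[4:4 + header_len])
    payload = body[4 + header_len:4 + header_len + header["payload_len"]]
    return header, payload

# Usage:
# pkt = build_packet("cam0", 1718000000.123, 28.50, 77.18, 42, b"<frame bytes>")
# assert validate_packet(pkt) is not None
```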
Once the packet is validated, the reconstructing module 208b begins the process of combining data from the one or more packets to recreate the original raw roadway data. The reconstructing module 208b leverages metadata to reorder the one or more packets that may have been received out of sequence. This ensures the data's chronological and logical coherence. The geospatial coordinates data and the one or more timestamps are also used to synchronize the roadway data from the one or more sensors 108. The one or more packets received in an incorrect order due to network delays or transmission disruptions are rearranged based on the packet sequence numbers. Missing packets are flagged, and the system 102 may either proceed with partial data or wait for retransmission, depending on the system’s configuration and requirements for real-time or batch processing. The reconstructing module 208b is configured to extract the payload data from each packet. The extracted payload data is then combined to recreate the original roadway data in a format usable by downstream subsystems, such as the data processing subsystem 212 or external analytics tools. If errors are detected in the payload, the system 102 logs the errors, and one or more error-checking mechanisms are triggered for error recovery. These one or more error-checking mechanisms include one of: requesting retransmission and interpolating missing data, depending on the application's requirements. The fully reconstructed roadway data is stored in the one or more databases 104, which may include local storage or cloud-based storage systems. The data is stored in a structured format, enabling efficient querying and retrieval by other components of the system 102, such as the data processing subsystem 212 and synchronization subsystem 210. In an exemplary embodiment, the synchronisation subsystem 210 is configured to ensure precise and efficient synchronisation of the reconstructed roadway data from the one or more sensors 108, by associating the roadway data with the one or more timestamps against geospatial coordinates data. This synchronisation ensures seamless alignment of diverse data streams, enabling their effective integration for downstream processing and analysis. The synchronisation subsystem 210 employs a proprietary synchronisation model, leveraging advanced techniques such as, but not limited to, at least one of: linear interpolation, clock drift compensation, and multi-threaded processing, to achieve millisecond-level synchronisation of the roadway data. The proprietary synchronisation model addresses the challenge of temporal and spatial alignment of the data with varying frequencies, the one or more timestamps, and sources by employing multiple methods. The proprietary synchronisation model is configured to employ at least one of, but not limited to, a linear interpolation process, a clock drift compensation, and a multi-threaded processing technique for millisecond-level synchronisation of the roadway data. The linear interpolation process is configured to ensure that the roadway data from the one or more sensors 108 operating at varying frequencies is interpolated to align the one or more timestamps effectively. The clock drift compensation is configured to facilitate parallel processing of synchronisation tasks, ensuring efficient and real-time operation without compromising accuracy. 
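Returning briefly to the reconstructing module 208b described at the start of this passage, the following is a minimal sketch of reordering validated packets by sequence number, flagging gaps, and reassembling the payloads. The function and field names are illustrative assumptions, not the system's actual identifiers.

```python
# Hedged sketch of the reconstruction step: reorder by sequence number,
# report missing packets, and concatenate payloads into raw roadway data.
from typing import Iterable, List, Tuple

def reconstruct(packets: Iterable[Tuple[dict, bytes]]) -> Tuple[bytes, List[int]]:
    """packets: (header, payload) pairs as returned by a validator like validate_packet()."""
    ordered = sorted(packets, key=lambda p: p[0]["seq"])  # reorder out-of-sequence packets
    missing: List[int] = []
    if ordered:
        expected = range(ordered[0][0]["seq"], ordered[-1][0]["seq"] + 1)
        present = {h["seq"] for h, _ in ordered}
        missing = [s for s in expected if s not in present]  # flag gaps for retransmission
    data = b"".join(payload for _, payload in ordered)       # recreate the raw roadway data
    return data, missing

# Usage:
# roadway_bytes, missing_seqs = reconstruct(valid_packets)
# if missing_seqs:
#     pass  # request retransmission or proceed with partial data, per configuration
```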
The synchronisation subsystem 210 aligns the one or more timestamps associated with the second defined frequencies (e.g., geospatial data generated at elevated frequencies) with the one or more timestamps associated with the first defined frequencies (e.g., the roadway data generated by vision sensors or IMU sensors) using at least one of: interpolating, periodic recalibration of clocks, dual-layer synchronisation, and the like. In an exemplary embodiment, the alignment of the one or more timestamps is achieved by applying the linear interpolation process. For example, each vision frame timestamp operating at the first defined frequencies is interpolated using at least two adjoining geospatial coordinates to establish a consistent temporal relationship between the roadway data and the geospatial coordinates data. For instance, the geospatial coordinates data is received at 25 Hz (40 ms interval) from the space-based radio-navigation unit, which is time-stamped at the central unit. The frames associated with the one or more vision sensors are received at 30 frames per second (fps) (frames spaced at ~33.3 ms intervals) or 60 fps (frames spaced at ~16.67 ms intervals). Due to the mismatch between the frequency of the geospatial coordinates data and the frames, linear interpolation is used to align and timestamp the frames. This is done by finding the two closest GPS samples around the time of the sensor reading, at times T1 and T2. The GPS data G(T) is then interpolated at the desired sensor timestamp T using Equation 1. Equation 1: G(T) = G(T1) + ((T − T1) / (T2 − T1)) × (G(T2) − G(T1)) Here, the microsecond-level granularity system clock of the central unit facilitates precise estimation of the one or more timestamps at which the frames are received. In this way, the one or more timestamps of the geospatial coordinates data are generated at the exact time corresponding to the high-rate sensor reading. For high-frequency synchronisation requirements, the synchronisation subsystem 210 employs multi-threading techniques to concurrently process the roadway data and geospatial coordinates data, reducing processing delays and enabling real-time synchronisation. The synchronisation subsystem 210 is configured to perform periodic recalibration to ensure temporal accuracy by adjusting clocks to a master clock associated with the GPU-based processing unit 214. The clocks adjusted include the internal clock of the one or more sensors 108 operating at the first defined frequencies and the elevated-frequency clock of the space-based radio-navigation unit generating geospatial coordinates data at the second defined frequencies. Further, the recalibration process is initiated when the clock drift exceeds a predefined tolerance threshold. For instance, if the drift between the internal clock and the master clock surpasses Δt, recalibration is triggered. The recalibration aligns the internal and elevated-frequency clocks with the master clock to maintain consistency across the system. The synchronisation subsystem 210 employs a dual-layer synchronisation mechanism to achieve precise alignment between the roadway data and the geospatial coordinates data. This dual-layer synchronisation mechanism comprises the following layers: hardware-based initial synchronisation and instruction-based secondary synchronisation. The hardware-based initial synchronisation is configured to utilize global positioning system (GPS) signals for hardware-level synchronisation.
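Before continuing with the dual-layer mechanism, the interpolation step of Equation 1 above can be sketched as follows: estimating a GPS fix at a vision-frame timestamp from the two nearest GPS samples. The bisect-based lookup, the (lat, lon) tuple representation, and the variable names are assumptions made for illustration only.

```python
# Minimal sketch of the Equation 1 interpolation step (assumed data layout).
from bisect import bisect_left
from typing import List, Tuple

def interpolate_gps(frame_t: float,
                    gps_times: List[float],
                    gps_coords: List[Tuple[float, float]]) -> Tuple[float, float]:
    """gps_times must be sorted; gps_coords[i] is the (lat, lon) fix at gps_times[i]."""
    i = bisect_left(gps_times, frame_t)
    if i == 0:
        return gps_coords[0]                 # frame precedes the first fix
    if i == len(gps_times):
        return gps_coords[-1]                # frame follows the last fix
    t1, t2 = gps_times[i - 1], gps_times[i]  # adjoining GPS samples around frame_t
    w = (frame_t - t1) / (t2 - t1)           # Equation 1 weighting
    (lat1, lon1), (lat2, lon2) = gps_coords[i - 1], gps_coords[i]
    return lat1 + w * (lat2 - lat1), lon1 + w * (lon2 - lon1)

# Usage: interpolate_gps(10.0167, [10.00, 10.04], [(28.5000, 77.1800), (28.5001, 77.1803)])
```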
The hardware-based initial synchronisation is configured to ensure microsecond-level synchronisation accuracy by aligning sensor clocks with GPS signals to create a global reference for the one or more sensors 108. The instruction-based secondary synchronisation is configured to apply the linear interpolation process to further refine the synchronisation between the roadway data and the geospatial coordinates data. The instruction-based secondary synchronisation is configured to implement clock drift compensation to correct any remaining temporal deviations introduced during data transmission or processing. By employing a combination of interpolation, multi-threading, and recalibration, the proprietary synchronisation model achieves millisecond-level accuracy, ensuring precise temporal alignment of roadway data. The integration of multi-threaded processing techniques allows the synchronisation subsystem 210 to handle high-frequency data streams in real time, making it suitable for dynamic roadway monitoring applications. The periodic recalibration process and dual-layer synchronisation mechanism mitigate the effects of clock drift and transmission delays, enhancing the reliability of the synchronised data. The synchronisation subsystem 210 is configured to handle the roadway data and varying data frequencies, making it adaptable to diverse roadway monitoring scenarios. In an exemplary embodiment, the data processing subsystem 212 is configured to perform advanced processing on the synchronised roadway data to detect and analyse the one or more road deformities. The data processing subsystem 212 is configured to employ sophisticated computational models, including at least one of: one or more AI models and one or more ML models, for extracting actionable insights from the roadway data. The data processing subsystem 212 integrates various functionalities such as model-based detection, post-processing, and analytics, enabling comprehensive roadway condition analysis and reporting. The data processing subsystem 212 utilizes at least one of: AI models and ML models, including deep learning models, to detect the one or more road deformities in the synchronised roadway data. The deep learning model employed is a Single Shot Multibox Detector (SSD) framework with a MobileNetV2 backbone. The SSD framework is optimized for real-time object detection, making it suitable for detecting varying road deformities with high accuracy and efficiency. The SSD framework is trained on one or more structured datasets, which comprise regional roadways data annotated with at least one of: one or more characteristics and one or more classifications. The one or more characteristics are the specific attributes of regional roadways, such as surface type, texture, and lane markings. The one or more classifications are the specific attributes of regional roadways, such as varying road deformities (e.g., potholes, cracks, and uneven surfaces), environmental challenges (e.g., lighting conditions, weather variations), and construction materials (e.g., asphalt, concrete). The one or more structured datasets ensure the robustness and adaptability of at least one of: one or more AI models and one or more ML models for diverse roadway scenarios. The synchronised roadway data and data associated with the detected one or more road deformities are stored in the one or more databases 104. The storage is structured to facilitate efficient retrieval, post-processing, and analysis of both raw and processed data. 
The one or more roadway deformities include, but are not limited to, potholes, cracks, surface wear, rutting, shoving, cross drainage (CD) structure blockages, road surface deformities, road surface depressions, crust deformities, inconsistencies in bitumen crust density, and the like.

The data processing subsystem 212 includes a data analytics engine 216 integrated with the GUI unit 214 to generate comprehensive insights and outputs based on the post-processed roadway data. The data analytics engine 216 is configured to generate at least one of: analysis output data, an inspection dataset, detection summary data, diagnostic reports, roadway condition profiles, and visual representation data, based on the post-processed roadway data. The GUI unit provides an interactive platform for visualizing and analysing the roadway data. The one or more users are able to access detailed insights, adjust analysis parameters, and generate customized reports for specific use cases. The analysis output data comprises summarized detection results, including, but not limited to, deformity types, sizes, and locations. The inspection dataset may comprise curated datasets for manual or automated inspection processes. The detection summary data may comprise aggregated metrics and statistics on the detected road deformities. The diagnostic reports may include detailed reports highlighting roadway condition and potential repair requirements. The roadway condition profiles may comprise high-level profiles of road quality across regions, indicating areas requiring maintenance. The visual representation data may comprise graphical overlays of the detected road deformities on images or maps for intuitive visualization.

In an exemplary embodiment, the SSD framework with the MobileNetV2 backbone is trained to detect and classify the one or more road deformities based on images captured by the one or more vision sensors or the one or more cameras. The SSD framework utilizes a single deep neural network for object detection and prediction. The SSD framework discretizes the output space of the one or more bounding boxes into a set of default boxes over different aspect ratios and scales per feature map location, allowing for effective detection of road damage of varying sizes and shapes. Further, the SSD framework is customized and optimized by training on the one or more structured datasets comprising a plurality of images of regional roads to ensure region-specific accuracy and performance.
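To make the default-box discretization described above concrete, the following sketch generates SSD-style default boxes over a grid of feature-map locations; the grid sizes, scales, and aspect ratios shown are illustrative values rather than those of the trained framework.

```python
import itertools
import math

def default_boxes(feature_map_size, scale, aspect_ratios):
    """Generate SSD-style default boxes (cx, cy, w, h in relative [0, 1] coordinates),
    one box per aspect ratio at every feature-map location."""
    boxes = []
    for i, j in itertools.product(range(feature_map_size), repeat=2):
        cx = (j + 0.5) / feature_map_size
        cy = (i + 0.5) / feature_map_size
        for ar in aspect_ratios:
            boxes.append((cx, cy, scale * math.sqrt(ar), scale / math.sqrt(ar)))
    return boxes

# Illustrative configuration: a coarse 5x5 map for large defects, a finer 10x10 map for small ones.
anchors = default_boxes(5, scale=0.4, aspect_ratios=(1.0, 2.0, 0.5)) \
        + default_boxes(10, scale=0.2, aspect_ratios=(1.0, 2.0, 0.5))
print(len(anchors))  # 5*5*3 + 10*10*3 = 375 default boxes
```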
To train the SSD framework, one or more structured datasets comprising 5,000 annotated images were prepared. The data preparation included preprocessing techniques such as resizing, normalization, and data augmentation (e.g., flipping, rotation) to improve the generalization capabilities of the SSD framework across diverse conditions. The SSD framework integrates the MobileNetV2 as a lightweight feature extractor, complemented by additional layers configured for efficient object detection and prediction of the one or more bounding boxes. The training process is conducted using a combination of hyperparameter optimization and advanced techniques, including the following parameters: learning rate: 0.001, batch size: 16, and number of epochs: 50. The SSD framework employs an Adam optimizer to ensure efficient convergence during training and utilizes a combined loss function that incorporates both localization loss and confidence loss for improved performance.

Post-training evaluation is conducted on a test set of 1,000 images, yielding a notable detection accuracy of approximately 85%. The evaluation metrics include mean average precision (mAP) and Intersection over Union (IoU), with a threshold of 0.5 for object localization. The SSD framework demonstrates robustness in detecting both large defects (e.g., significant potholes) and small defects under normal conditions. However, challenges are observed in detecting smaller defects under poor lighting conditions. To address these limitations, complementary data from the other sensors of the one or more sensors 108 is integrated to enhance overall detection accuracy and reliability. Training the SSD framework on the synchronized roadway data facilitates its deployment across the diverse datasets of the roadway data. The datasets of the roadway data are processed and stored in the one or more databases 104. The results of the SSD framework contribute significantly to the overall efficiency of the system 102 by integrating the detected roadway data into the synchronized roadway dataset. This enables seamless post-processing and analytics through the data processing subsystem 212, ensuring that detailed diagnostic reports, roadway condition profiles, and actionable insights are generated effectively.

In an exemplary embodiment, the determination of the relative position and orientation of the one or more sensors 108 associated with the system 102 is achieved through the use of the one or more IMUs. The one or more IMUs comprise an accelerometer sensor and a gyroscopic sensor, which together provide continuous navigational data in the form of x/y/z coordinates. The navigational data is transmitted in the XYZ32F format, where the x, y, and z coordinates are represented as 32-bit floating-point data. The system 102 processes this navigational data to estimate rotational movements, denoted as theta (θ), which represents the angular orientation of the one or more sensors 108. The accelerometer sensor provides x, y, and z coordinates, which are processed to calculate the acceleration angles and the rotational angles (theta). The following computations are performed:

i. Acceleration angle z (a.z) = atan2(y, z)
ii. Acceleration angle x (a.x) = atan2(x, √(y² + z²))
iii. Rotation theta x (t.x) = x · α + a.x · (1 − α)
iv. Rotation theta z (t.z) = z · α + a.z · (1 − α)
v. Rotation theta y (t.y) = π (3.14159)
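As a minimal sketch of computations (i) and (ii) above, the following assumes the XYZ32F sample is packed as three little-endian 32-bit floats; the packing order and function names are assumptions for illustration.

```python
import math
import struct

def unpack_xyz32f(payload):
    """Unpack an XYZ32F navigational sample: three 32-bit floats (assumed little-endian)."""
    return struct.unpack("<3f", payload)

def acceleration_angles(x, y, z):
    """Tilt angles from accelerometer coordinates, per computations (i) and (ii)."""
    a_z = math.atan2(y, z)                          # acceleration angle z (a.z)
    a_x = math.atan2(x, math.sqrt(y * y + z * z))   # acceleration angle x (a.x)
    return a_x, a_z

# Example: a near-level sensor with gravity mostly along the z axis.
sample = struct.pack("<3f", 0.02, 0.05, 9.79)
print(acceleration_angles(*unpack_xyz32f(sample)))
```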
To compute theta accurately, a complementary filter is applied to blend the high-frequency signals from the gyroscope with the low-frequency signals from the accelerometer. The filter works as follows:

High-pass filter (gyroscope component) = θ · α. This allows short-duration signals to pass through while filtering out steady-state signals, effectively cancelling drift.

Low-pass filter (accelerometer component) = acceleration · (1 − α). This passes long-term changes while filtering out short-term fluctuations.

The filter ensures that the final computation of theta balances the drift-prone gyroscope data with the disturbance-sensitive accelerometer data. The value of alpha (where 0 < α < 1) determines the relative weighting of the gyroscope and accelerometer contributions.
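Building on the acceleration-angle sketch above, the following is a minimal complementary-filter update; the alpha value of 0.98 and the integration of the gyroscope rate over dt are illustrative assumptions about how the blended terms are obtained, not the system's stated parameters.

```python
import math

ALPHA = 0.98  # assumed filter coefficient, 0 < alpha < 1

def complementary_filter_step(theta_x, theta_z, gyro_rate_x, gyro_rate_z,
                              accel_angle_x, accel_angle_z, dt):
    """One update of the complementary filter: the gyroscope term acts as the
    high-pass (drift-cancelling) component and the accelerometer term as the
    low-pass (long-term reference) component."""
    theta_x = ALPHA * (theta_x + gyro_rate_x * dt) + (1.0 - ALPHA) * accel_angle_x
    theta_z = ALPHA * (theta_z + gyro_rate_z * dt) + (1.0 - ALPHA) * accel_angle_z
    theta_y = math.pi  # fixed per computation (v) above
    return theta_x, theta_y, theta_z

# Example: a 10 ms update with a small rotation rate about x and near-level tilt angles.
print(complementary_filter_step(0.0, 0.0, gyro_rate_x=0.01, gyro_rate_z=0.0,
                                accel_angle_x=0.002, accel_angle_z=0.005, dt=0.01))
```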

Documents

Application Documents

# Name Date
1 202311079927-STATEMENT OF UNDERTAKING (FORM 3) [24-11-2023(online)].pdf 2023-11-24
2 202311079927-PROVISIONAL SPECIFICATION [24-11-2023(online)].pdf 2023-11-24
3 202311079927-FORM FOR STARTUP [24-11-2023(online)].pdf 2023-11-24
4 202311079927-FORM FOR SMALL ENTITY(FORM-28) [24-11-2023(online)].pdf 2023-11-24
5 202311079927-FORM 1 [24-11-2023(online)].pdf 2023-11-24
6 202311079927-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [24-11-2023(online)].pdf 2023-11-24
7 202311079927-EVIDENCE FOR REGISTRATION UNDER SSI [24-11-2023(online)].pdf 2023-11-24
8 202311079927-DRAWINGS [24-11-2023(online)].pdf 2023-11-24
9 202311079927-APPLICATIONFORPOSTDATING [22-11-2024(online)].pdf 2024-11-22
10 202311079927-FORM-26 [26-11-2024(online)].pdf 2024-11-26
11 202311079927-FORM-5 [23-01-2025(online)].pdf 2025-01-23
12 202311079927-FORM-26 [23-01-2025(online)].pdf 2025-01-23
13 202311079927-FORM FOR SMALL ENTITY [23-01-2025(online)].pdf 2025-01-23
14 202311079927-FORM 3 [23-01-2025(online)].pdf 2025-01-23
15 202311079927-EVIDENCE FOR REGISTRATION UNDER SSI [23-01-2025(online)].pdf 2025-01-23
16 202311079927-DRAWING [23-01-2025(online)].pdf 2025-01-23
17 202311079927-CORRESPONDENCE-OTHERS [23-01-2025(online)].pdf 2025-01-23
18 202311079927-COMPLETE SPECIFICATION [23-01-2025(online)].pdf 2025-01-23