Abstract: CANCER DETECTION SYSTEM AND METHOD ABSTRACT A cancer detection system (100) and method (300) for detecting lung and colon cancer using cloud-edge computing and deep learning are disclosed. The cancer detection system (100) comprises a gateway device (102) adapted to collect histopathological image data from a patient. The cancer detection system (100) further comprises an edge device (106) in communication with the gateway device (102). The cancer detection system (100) further comprises a cloud server (110) communicatively coupled with the edge device (106). The cloud server (110) receives the classified results from the edge device (106), performs further analysis and storage of the results, provides remote access to the results for clinicians, and displays real-time cancer diagnosis results to the clinicians using an output interface (104) associated with the gateway device (102). The cancer detection system (100) reduces human error and improves accuracy in lung and colon cancer detection. Claims: 10, Figures: 4
Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a cancer detection system and particularly to a cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning.
Description of Related Art
[002] Cancer of the lung and colon remains a major global health concern, causing significant mortality and morbidity. Early detection of these cancers is critical for improving patient survival rates, yet accurate diagnosis at early stages remains difficult in many clinical environments. Accurate diagnosis typically relies on the interpretation of histopathological images, which requires expert judgment and substantial time. Variability in interpretation across pathologists and institutions adds to the challenge of achieving consistent diagnostic outcomes.
[003] Several technological advancements have sought to address these issues through development of computer-aided diagnostic tools and artificial intelligence systems. Systems in this area often utilize machine learning and image processing techniques to assist clinicians in identifying cancerous tissue. Prominent systems include platforms from major healthcare technology providers and research organizations that offer diagnostic assistance for oncology applications. These systems aim to support decision-making and reduce diagnostic errors in clinical practice.
[004] Despite these developments, existing systems continue to exhibit significant limitations. Many systems rely on models trained with small or imbalanced datasets, which affects their ability to generalize across diverse patient populations. Certain systems still incorporate manual feature extraction steps, which introduces the possibility of errors. Additionally, some systems require high computational resources, which restricts their deployment in resource-limited settings. Furthermore, real-time diagnostic feedback is often unavailable, which delays clinical decision-making and impacts timely intervention.
[005] There is thus a need for an improved and advanced cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning that can address the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning. The cancer detection system comprises a gateway device adapted to collect histopathological image data from a patient. The cancer detection system further comprises an edge device in communication with the gateway device. The edge device is configured to preprocess the collected image data to enhance image quality, and execute a customized convolutional neural network (CNN) model to extract features and classify the image data into one or more cancer stages. The cancer detection system further comprises a cloud server communicatively coupled with the edge device. The cloud server is configured to receive the classified results from the edge device; perform further analysis and storage of the results; provide remote access to the results for clinicians; and display real-time cancer diagnosis results to the clinicians using an output interface associated with the gateway device.
[007] Embodiments in accordance with the present invention further provide a method for detecting lung and colon cancer using a cancer detection system. The method comprises the steps of collecting histopathological image data from a patient through a gateway device; preprocessing the collected image data to enhance image quality; transmitting the pre-processed image data to an edge device; executing a customized convolutional neural network (CNN) model on the edge device to extract features corresponding to progression of the cancer stages; classifying the image data into one or more cancer stages based on the extracted features; transmitting classification results to a cloud server for further analysis, storage, and remote accessibility; and providing a cancer diagnosis output in real time to clinicians using an output interface associated with the gateway device.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning.
[009] Next, embodiments of the present application may provide a cancer detection system that provides automated analysis of histopathological images using a customized Convolutional Neural Network (CNN) model.
[0010] Next, embodiments of the present application may provide a cancer detection system that reduces human error and improves accuracy in lung and colon cancer detection.
[0011] Next, embodiments of the present application may provide a cancer detection system that enables portable and low-power cancer detection.
[0012] Next, embodiments of the present application may provide a cancer detection system that is suitable for deployment in remote or resource-constrained environments.
[0013] Next, embodiments of the present application may provide a cancer detection system that allows real-time data transfer and remote accessibility.
[0014] Next, embodiments of the present application may provide a cancer detection system that ensures faster diagnostic processes and enables telemedicine capabilities.
[0015] Next, embodiments of the present application may provide a cancer detection system that is optimized for low power consumption and efficient inference.
[0016] Next, embodiments of the present application may provide a cancer detection system that reduces operational costs and eliminates need for high-performance computing infrastructure.
[0017] Next, embodiments of the present application may provide a cancer detection system in which the combined use of cloud and edge computing provides scalability and rapid processing.
[0018] Next, embodiments of the present application may provide a cancer detection system that ensures timely diagnosis and improved patient outcomes without compromising system performance.
[0019] These and other advantages will be apparent from the present application of the embodiments described herein.
[0020] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0022] FIG. 1A illustrates a block diagram of a cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning, according to an embodiment of the present invention;
[0023] FIG. 1B illustrates an exemplary setup of a cancer detection system for detecting lung and colon cancer using cloud-edge computing and deep learning, according to an embodiment of the present invention;
[0024] FIG. 2 illustrates cancer stages, according to an embodiment of the present invention; and
[0025] FIG. 3 depicts a flowchart of a method for detecting lung and colon cancer using a cancer detection system, according to an embodiment of the present invention.
[0026] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0027] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0028] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0029] The term “patient” refers to an individual from whom biological samples, medical data, or histopathological image data are collected for the purpose of disease diagnosis, monitoring, or treatment. The patient can include, without limitation, a human subject of any demography under clinical examination or medical observation.
[0030] The term “clinician” refers to a medical professional, including but not limited to a physician, an oncologist, a pathologist, or a healthcare practitioner, who is authorized to interpret diagnostic results, provide medical consultation, and perform disease management based on the output of the system or method described herein.
[0031] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0032] FIG. 1A illustrates a block diagram of a cancer detection system 100 for detecting lung and colon cancer using cloud-edge computing and deep learning, according to an embodiment of the present invention. In an embodiment of the present invention, the cancer detection system 100 may gather patient data. Further, the cancer detection system 100 may execute computational programs and algorithms for analysing the patient data to determine the presence of lung and colon cancer. The cancer detection system 100 may be optimized for low power consumption and quicker edge-based inference. The cancer detection system 100 may offer portable, real-time cancer diagnosis utilizing histopathology images while executing a tailored convolutional neural network (CNN) model. The cancer detection system 100 streamlines the cancer detection process and ensures responsiveness and scalability, ultimately leading to improved patient outcomes through timely and accurate diagnoses.
[0033] According to the embodiments of the present invention, the cancer detection system 100 may incorporate non-limiting hardware components to enhance processing speed and efficiency. For example, the cancer detection system 100 may comprise a gateway device 102, an output interface 104, an edge device 106, a fog computing environment 108, and a cloud server 110. In an embodiment of the present invention, the hardware components of the cancer detection system 100 may be integrated with computer-executable instructions for overcoming the challenges and the limitations of the existing cancer detection systems.
[0034] In an embodiment of the present invention, the gateway device 102 may be adapted to collect the histopathological image data from a patient. The collected histopathological image may enable a diagnosis of human tissues for detection of early signs of diseases. The gateway device 102 may be, but not limited to, a smartphone, a tablet, a wearable device, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the gateway device 102, including known, related art, and/or later developed technologies.
[0035] The gateway device 102 may further comprise the output interface 104. The output interface 104 may be adapted to display real-time cancer diagnosis results based on the collected histopathological image. The displayed real-time cancer diagnosis result may be viewed by clinicians. The output interface 104 may further enable the clinicians to print, upload, share, and so forth, the displayed real-time cancer diagnosis result.
[0036] In an embodiment of the present invention, the edge device 106 may be in communication with the gateway device 102. The edge device 106 may comprise processing engines. The processing engines may be, but not limited to, a Clouds Controller engine, a Service Director engine, a Protection Supervisor engine, an Information Director engine, a Service Observe engine, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the processing engines, including known, related art, and/or later developed technologies.
[0037] The Clouds Controller engine may be configured to manage connectivity between the edge device 106 and one or more cloud platforms. It may facilitate workload offloading, synchronize data, handle virtualization, and optimize cloud resource utilization. The Service Director engine may be configured to orchestrate service delivery across the edge and gateway devices. The Service Director engine may further be configured to allocate resources, manage service-level agreements (SLAs), initiate or terminate services, and dynamically adapt to user or application requirements. The Protection Supervisor engine may be configured to monitor security-related aspects of the edge device 106. The Protection Supervisor engine may further be configured to enforce access control policies, detect anomalies, supervise data encryption/decryption, and respond to threats in real time. The Information Director engine may be configured to manage data collection, classification, and routing. The Information Director engine may further be configured to prioritize information flow, direct data to relevant applications or storage systems, and optimize bandwidth usage. The Service Observe engine may be configured to provide real-time monitoring of service performance. The Service Observe engine may further be configured to measure latency, throughput, reliability, and compliance with quality of service (QoS) requirements, while generating alerts or reports for corrective action. Collectively, these engines may be configured to operate in an integrated manner such that the Clouds Controller engine enables scalable connectivity, the Service Director engine coordinates efficient delivery, the Protection Supervisor engine secures the environment, the Information Director engine manages intelligent data flow, and the Service Observe engine ensures continuous monitoring. 
In combination, these engines may provide a unified framework that enables reliable, secure, and optimized service execution at the edge device 106.
[0038] The edge device 106 may be configured to preprocess the collected image data to enhance image quality. The image quality may be enhanced by performing preprocessing techniques. The preprocessing techniques may be, but not limited to, noise reduction, normalization, image resizing, contrast enhancement, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the preprocessing techniques, including known, related art, and/or later developed technologies.
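By way of a non-limiting illustrative sketch, the preprocessing chain described above (noise reduction, contrast enhancement, image resizing, and normalization) may be implemented on the edge device as follows. The function name, the 3x3 mean filter, and the 224-pixel target resolution are illustrative assumptions only and are not prescribed by the present disclosure.

```python
import numpy as np

def preprocess(img: np.ndarray, size: int = 224) -> np.ndarray:
    """Apply an illustrative preprocessing chain to one grayscale image."""
    # Noise reduction: simple 3x3 mean filter via edge padding and window averaging.
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    denoised = sum(
        padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    # Contrast enhancement: linear stretch to the full [0, 255] range.
    lo, hi = denoised.min(), denoised.max()
    stretched = (denoised - lo) / (hi - lo + 1e-9) * 255.0
    # Image resizing: nearest-neighbour sampling to a fixed model input size.
    ys = np.arange(size) * img.shape[0] // size
    xs = np.arange(size) * img.shape[1] // size
    resized = stretched[np.ix_(ys, xs)]
    # Normalization: scale intensities to [0, 1] for the CNN input.
    return resized / 255.0
```

In a deployed embodiment, library routines (e.g., dedicated denoising or histogram-equalization filters) would typically replace these hand-rolled steps.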
[0039] The edge device 106 may be configured to execute a customized convolutional neural network (CNN) model to extract features and patterns. Based on the extracted features and patterns, the customized convolutional neural network (CNN) model may classify the image data into one or more cancer stages 200 (as shown in FIG. 2). The classification may be conducted using a deep learning algorithm of the customized convolutional neural network (CNN) model. Further based on the classification, the edge device 106 may be configured to identify likelihood of the cancer stages 200. In an embodiment of the present invention, cancer stages 200 may further be explained in detail in conjunction with the FIG. 2.
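As a non-limiting illustration of the classification step, the following sketch shows a minimal convolutional forward pass (convolution, ReLU activation, global average pooling, and a softmax over stage probabilities) in plain NumPy. The layer sizes and the use of random weights are illustrative assumptions; a deployed model would use trained weights and a substantially deeper architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Valid 2-D convolution of a single-channel image with one kernel."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def classify(img: np.ndarray, kernels: np.ndarray,
             w: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Conv -> ReLU -> global average pooling -> dense -> softmax."""
    feats = np.array([conv2d(img, k).clip(min=0).mean() for k in kernels])
    logits = feats @ w + b
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()                  # one probability per cancer stage
```

The returned vector sums to one, so its maximum entry can serve directly as the classification confidence reported by the system.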
[0040] The convolutional neural network (CNN) model may be optimized through a hyperparameter tuning to improve classification accuracy. Further, the hyperparameter tuning may enhance performance of the convolutional neural network (CNN) model. The hyperparameter tuning may lower mistakes and may raise an overall classification accuracy of the convolutional neural network (CNN) model.
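A non-limiting sketch of the hyperparameter tuning described above is a simple grid search over candidate configurations. The `validation_accuracy` function here is a hypothetical stand-in: in practice it would train and evaluate the CNN model for each configuration on a held-out validation split; the grid values are likewise illustrative.

```python
import itertools

def validation_accuracy(learning_rate: float, batch_size: int) -> float:
    # Stand-in score so the sketch is runnable; a real implementation would
    # train the CNN with these settings and return validation accuracy.
    return 1.0 - abs(learning_rate - 1e-3) * 100 - abs(batch_size - 32) / 1000

grid = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size": [16, 32, 64],
}

# Evaluate every (learning_rate, batch_size) pair and keep the best scorer.
best = max(
    itertools.product(*grid.values()),
    key=lambda cfg: validation_accuracy(*cfg),
)
print({"learning_rate": best[0], "batch_size": best[1]})
```

More sample-efficient strategies (random search, Bayesian optimization) follow the same select-by-validation-score pattern.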
[0041] In an embodiment of the present invention, the edge device 106 may be configured to operate in the fog computing environment 108. The fog computing environment 108 may be configured to reduce latency and enable faster analysis. Additionally, the faster analysis may be enabled by analysing the image data locally, closer to the gateway device 102.
[0042] In an embodiment of the present invention, the cloud server 110 is communicatively coupled with the edge device 106. The cloud server 110 may be configured to receive the classified results from the edge device 106. The cloud server 110 may be configured to perform further analysis and storage of the results.
[0043] The further analysis may be, but not limited to, validating the classification results using secondary machine learning or statistical models, generating confidence scores associated with the identified cancer stages 200, correlating the classified results with additional patient data, such as demographic information, medical history, and laboratory reports stored on the cloud server 110, performing predictive analytics to determine disease progression trends, and so forth. The further analysis may additionally include anomaly detection to identify inconsistencies or errors in the classification, as well as aggregating and processing data from multiple patients for clinical research or model optimization. Such operations enable enhanced diagnostic reliability, support decision-making for clinicians, and facilitate continuous improvement of the cancer detection system 100.
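As a non-limiting illustration of how a confidence score may accompany an identified cancer stage, the following sketch converts a per-stage probability vector into a scored result and flags low-confidence cases for the secondary review described above. The 0.80 review threshold and the dictionary field names are illustrative assumptions.

```python
import numpy as np

STAGES = ("Stage 0", "Stage 1", "Stage 2", "Stage 3", "Stage 4")

def confidence_report(probs: np.ndarray) -> dict:
    """Turn a per-stage probability vector into a confidence-scored result."""
    idx = int(np.argmax(probs))
    score = float(probs[idx])
    return {
        "stage": STAGES[idx],
        "confidence": round(score, 3),
        # Low-confidence results are flagged for secondary validation on the
        # cloud server (the 0.80 threshold is illustrative only).
        "needs_review": score < 0.80,
    }

print(confidence_report(np.array([0.01, 0.02, 0.94, 0.02, 0.01])))
```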
[0044] The cloud server 110 may be configured to provide remote access to the results for the clinicians. The cloud server 110 may be configured to display the real-time cancer diagnosis results to the clinicians using the output interface 104 associated with the gateway device 102. In an embodiment of the present invention, the real-time cancer diagnosis results displayed to the clinician on the output interface 104 may comprise one or more diagnostic indicators generated by the cancer detection system 100.
[0045] The real-time cancer diagnosis results may be, but not limited to, identified cancer stages 200, a confidence score or probability value associated with the classification, a graphical representation of the analysed histopathological image highlighting regions of interest, comparative analysis with previously stored patient records or reference datasets, and recommendations for further clinical evaluation based on the detected cancer stage. The result may further include a summary report comprising patient identification details, date and time of analysis, and statistical data derived from aggregated cases to assist in decision-making. In certain embodiments, the result may also incorporate an alert or notification for urgent cases requiring immediate clinical attention.
[0046] In an embodiment of the present invention, the cloud server 110 may be remotely located. In an exemplary embodiment of the present invention, the cloud server 110 may be a public cloud server. In another exemplary embodiment of the present invention, the cloud server 110 may be a private cloud server. In yet another embodiment of the present invention, the cloud server 110 may be a dedicated cloud server. The cloud server 110 may be, but not limited to, a Microsoft Azure cloud server, a Google Compute Engine (GCE) cloud server, an Amazon Elastic Compute Cloud (EC2) cloud server, and so forth. In a preferred embodiment of the present invention, the cloud server 110 may be an Amazon Web Services (AWS) cloud server. Embodiments of the present invention are intended to include or otherwise cover any type of the cloud server 110 including known, related art, and/or later developed technologies.
[0047] FIG. 1B illustrates an exemplary setup of the cancer detection system 100, according to an embodiment of the present invention. In an exemplary embodiment, a 55-year-old patient 'P' may undergo a biopsy for suspected colon cancer performed by a doctor 'D'. The patient 'P' may provide the histopathological images via a smartphone functioning as the gateway device 102. The image data may be transmitted to the edge device 106 installed at a diagnostic center, where preprocessing techniques such as noise reduction and normalization may be applied. The edge device 106 may execute the customized convolutional neural network (CNN) model. The edge device 106 may be configured to operate in the fog computing environment 108. The customized convolutional neural network (CNN) model may classify the image as Stage 2 cancer with an associated confidence score of 94%. The classified result, along with an annotated image highlighting abnormal tissue regions and a summary report, may be uploaded to the cloud server 110. The cloud server 110 may validate the result, correlate it with the medical history of the patient stored on the cloud server 110, and generate predictive insights regarding disease progression. The real-time cancer diagnosis results may be displayed in real time on the output interface 104 operated by the clinicians, enabling timely treatment planning.
[0048] FIG. 2 illustrates the cancer stages 200, according to an embodiment of the present invention. In an embodiment of the present invention, cancer stages 200 may comprise a stage 0 cancer 202, a stage 1 cancer 204, a stage 2 cancer 206, a stage 3 cancer 208, and a stage 4 cancer 210.
[0049] The stage 0 cancer 202 may represent carcinoma in situ, wherein abnormal cells are present but confined to the layer of origin without invasion into neighboring tissues. This stage may be characterized by early detection markers and high treatment success rates. The stage 1 cancer 204 may represent localized cancer wherein malignant cells have invaded nearby tissues but remain limited in size and scope. The spread may not involve lymph nodes or distant regions. Treatment at this stage may include surgical removal, localized therapy, or targeted interventions. The stage 2 cancer 206 may represent a progression where tumors increase in size and may spread to nearby lymph nodes, but not to distant organs. This stage may require a combination of treatments including surgery, radiation therapy, and chemotherapy to prevent further progression. The stage 3 cancer 208 may represent an advanced local spread of cancer wherein tumors are larger, have invaded surrounding structures, and may involve multiple lymph nodes. This stage may be characterized by more aggressive treatment plans including systemic therapy, multi-modal approaches, and intensive monitoring. The stage 4 cancer 210 may represent metastatic cancer wherein malignant cells spread beyond the primary site to distant organs such as the lungs, liver, brain, or bones. This stage may require advanced therapies such as immunotherapy, precision medicine, or palliative care, depending on patient condition and response. Embodiments of the present invention may further include diagnostic, prognostic, and therapeutic systems integrated with cancer stages 200, thereby enabling stage-specific detection, monitoring, and treatment planning.
[0050] FIG. 3 depicts a flowchart of a method 300 for detecting lung and colon cancer using the cancer detection system 100, according to an embodiment of the present invention.
[0051] At step 302, the cancer detection system 100 may collect the histopathological image data from the patient through the gateway device 102.
[0052] At step 304, the cancer detection system 100 may preprocess the collected image data to enhance the image quality. The collected image data may be pre-processed using preprocessing techniques such as the noise reduction, the normalization, the image resizing, the contrast enhancement, and so forth.
[0053] At step 306, the cancer detection system 100 may transmit the pre-processed image data to the edge device 106.
[0054] At step 308, the cancer detection system 100 may execute the customized convolutional neural network (CNN) model on the edge device 106 to extract the features corresponding to progression of the cancer stages 200.
[0055] At step 310, the cancer detection system 100 may optimize the convolutional neural network (CNN) model through the hyperparameter tuning to improve classification accuracy and reduce inference error.
[0056] At step 312, the cancer detection system 100 may classify the image data into the one or more cancer stages 200 based on the extracted features.
[0057] At step 314, the cancer detection system 100 may transmit the classification results to the cloud server 110 for further analysis, storage, and remote accessibility.
[0058] At step 316, the cancer detection system 100 may provide the cancer diagnosis output in real time to the clinicians using the output interface 104 associated with the gateway device 102.
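Steps 302 through 316 of the method 300 may be summarized, purely for illustration, as a single orchestration function in which the gateway, edge, cloud, and output components are passed in as callables. All identifiers in this sketch are illustrative assumptions and do not correspond to names in the present disclosure.

```python
from typing import Any, Callable

def run_pipeline(raw_image: Any,
                 preprocess: Callable[[Any], Any],
                 classify: Callable[[Any], Any],
                 upload: Callable[[Any], Any],
                 display: Callable[[Any], None]) -> Any:
    """Illustrative end-to-end flow of method 300."""
    prepared = preprocess(raw_image)   # steps 304-306: enhance and send to edge
    probs = classify(prepared)         # steps 308-312: CNN features and staging
    record = upload(probs)             # step 314: cloud analysis and storage
    display(record)                    # step 316: real-time clinician output
    return record
```

Separating the stages behind callables mirrors the cloud-edge split: the `classify` callable runs on the edge device while `upload` targets the cloud server.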
[0059] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0060] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
Claims: CLAIMS
I/We Claim:
1. A cancer detection system (100) for detecting lung and colon cancer using cloud-edge computing and deep learning, the cancer detection system (100) comprising:
a gateway device (102) adapted to collect histopathological image data from a patient;
an edge device (106) in communication with the gateway device (102), the edge device (106) configured to:
preprocess the collected image data to enhance image quality, and
execute a customized convolutional neural network (CNN) model to extract features and classify the image data into one or more cancer stages (200); and
a cloud server (110) communicatively coupled with the edge device (106), characterized in that the cloud server (110) is configured to:
receive the classified results from the edge device (106);
perform further analysis and storage of the results;
provide remote access to the results for clinicians; and
display real-time cancer diagnosis results to the clinicians using an output interface (104) associated with the gateway device (102).
2. The cancer detection system (100) as claimed in claim 1, wherein the gateway device (102) is selected from a smartphone, a tablet, a wearable device, or a combination thereof.
3. The cancer detection system (100) as claimed in claim 1, wherein the edge device (106) is configured to perform preprocessing techniques selected from noise reduction, normalization, image resizing, contrast enhancement, or a combination thereof.
4. The cancer detection system (100) as claimed in claim 1, wherein the convolutional neural network (CNN) model is optimized through a hyperparameter tuning to improve classification accuracy.
5. The cancer detection system (100) as claimed in claim 1, wherein the edge device (106) is configured to operate in a fog computing environment (108) to reduce latency and enable faster analysis.
6. The cancer detection system (100) as claimed in claim 1, wherein the edge device (106) is configured to identify cancer stages (200) selected from a stage 0 cancer (202), a stage 1 cancer (204), a stage 2 cancer (206), a stage 3 cancer (208), and a stage 4 cancer (210) based on the classified image data.
7. A method (300) for detecting lung and colon cancer using a cancer detection system (100), the method (300) is characterized by steps of:
collecting histopathological image data from a patient through a gateway device (102);
preprocessing, using preprocessing techniques, the collected image data to enhance image quality;
transmitting the pre-processed image data to an edge device (106);
executing a customized convolutional neural network (CNN) model on the edge device (106) to extract features corresponding to progression of cancer stages (200);
classifying the image data into one or more cancer stages (200) based on the extracted features;
transmitting classification results to a cloud server (110) for further analysis, storage, and remote accessibility; and
providing a cancer diagnosis output in real time to clinicians using an output interface (104) associated with the gateway device (102).
8. The method (300) as claimed in claim 7, comprising step of optimizing the convolutional neural network (CNN) model through a hyperparameter tuning to improve classification accuracy and reduce inference error.
9. The method (300) as claimed in claim 7, wherein the preprocessing techniques are selected from noise reduction, normalization, image resizing, contrast enhancement, or a combination thereof.
10. The method (300) as claimed in claim 7, wherein the cancer stages (200) are selected from Stage 0, Stage 1, Stage 2, Stage 3, or Stage 4 based on the classified image data.
Date: August 28, 2025
Place: Noida
Nainsi Rastogi
Patent Agent (IN/PA-2372)
Agent for the Applicant