Abstract: A method (200) for identification of a specific fungal infection at point-of-care using Artificial Intelligence, comprising the steps of receiving a plurality of clinical input parameters and a clinical image of a lesion related to a patient (210), selecting the highest probability model based upon the received plurality of clinical input parameters and clinical image related to the patient (220), receiving a microscopic image of a test performed for confirmation of the fungal etiology of infection (230), determining a plurality of attributes of the fungal infection based upon the received test image via artificial intelligence (240), and displaying the confirmed specific fungal infection (250). Further, a system (400) for identification of fungal infection at point-of-care using artificial intelligence is provided. [Figure 3]
The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
Embodiments of the present invention relate to fungal infection detection and, more particularly, to a method and a system for identification of specific fungal infection at point-of-care using artificial intelligence.
BACKGROUND OF THE INVENTION
Fungal infection is a major cause of human health problems. More than 300 million patients worldwide are afflicted with a fungal infection. The most common fungal infections include a) Candidal infections such as Oropharyngeal candidiasis (Thrush), Vulvovaginitis (Vaginal candidiasis) and Balanitis (Penile candidiasis); b) Dermatophyte infections such as Tinea capitis, Tinea pedis, Tinea unguium, Tinea cruris, Tinea barbae, Tinea manuum and Tinea corporis; and c) Malassezia infections such as Tinea versicolor.
In order to diagnose a fungal infection, a diagnostician must first have observed symptoms, or have other reasons to suspect, that the patient might be suffering from a fungal infection. The presence of a fungal pathogenic agent, and the identification of the particular species of fungus responsible for causing such symptoms, can generally only be confirmed by taking a biological sample from the patient and culturing the sample.
Clinical diagnosis and treatment of fungi suffer from several shortcomings. Fungi generally grow more slowly than the major bacteremic organisms, so any diagnosis requiring an in vitro culture step is time consuming and carries a risk of contamination. Also, some fungi (again in diagnoses requiring in vitro cultivation) will not yield colonies on synthetic media for weeks, if at all. All of these factors, plus the fact that a wide array of fungi are potential systemic pathogens, point to the need for a direct method of fungal detection that is inclusive of virtually all fungi.
Therefore, there is a need in the art for an efficient and cost-effective method and system for identification of specific fungal infections, and more particularly for rapid detection of specific fungal infection at point-of-care, which ameliorates all or some of the deficiencies of the prior art or at least provides a viable alternative.
OBJECTS OF THE INVENTION
The object of the present invention is to disclose a method and a system for identification of fungal infection using artificial intelligence.
Another object of the present invention is to provide a method and a system for identification of specific fungal infection using Artificial Intelligence that rapidly predicts desired results for various types of fungal diseases and infections at point-of-care.
Yet another object of the present invention is to provide a method and a system for identification of fungal infection using artificial intelligence which is efficient, cost-effective and provides accurate diagnosis of specific fungal diseases at point-of-care.
SUMMARY OF THE INVENTION
The present invention is described hereinafter by various embodiments. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, the embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
According to a first aspect of the invention, a method for identification of fungal infection using Artificial Intelligence is disclosed. The method comprises the steps of, but is not limited to, receiving a plurality of clinical input parameters and a clinical image of a lesion related to a patient, determining the highest probability model based upon the received plurality of clinical input parameters and image of the lesion related to the patient, receiving a microscopic image of a test performed for confirmation of the fungal etiology of infection, determining a plurality of attributes of the fungal etiology based upon the received test image, and displaying the confirmed specific fungal infection.
In accordance with an embodiment of the invention, the plurality of clinical input parameters related to the patient is selected from a group consisting of, but not limited to, a plurality of clinical questions and a plurality of clinical symptoms.
In accordance with an embodiment of the invention, one or more higher probability models are determined from a stored plurality of probability models based upon the received plurality of clinical input parameters related to the patient. The highest probability model is then selected from the shortlisted models, by matching the clinical image of the lesion via artificial intelligence, and displayed.
In accordance with an embodiment of the invention, the plurality of attributes of the fungal etiology of infection is selected from a group consisting of, but not limited to, budding, hyphae and spores.
In accordance with an embodiment of the invention, the plurality of clinical questions is selected from a group consisting of, but not limited to, gender, income, living environment, clinical history, site of infection, pain, bleeding and duration of infection.
In accordance with an embodiment of the invention, the plurality of clinical symptoms is selected from a group consisting of, but not limited to, burning, itching, vaginal discharge, rash, chills, changes in nails, irritation, swelling, discolored patches, skin scraping, increased sweating, thick and lumpy discharge under the foreskin, fever, redness, contagious skin and annular node.
In accordance with an embodiment of the invention, the plurality of clinical input parameters, the image of the lesion and the microscopic image of the test for fungal etiology related to the patient are stored in a storage device.
In accordance with an embodiment of the invention, the confirmed specific fungal infection is displayed to the patient.
In accordance with an embodiment of the invention, a system for identification of fungal infection is disclosed. The system comprises a control module and an interface module. Further, the control module is configured to, but not limited to, receive a plurality of clinical input parameters and an image of a lesion related to a patient, receive a microscopic image of a test performed for confirmation of the fungal etiology of infection and determine a plurality of attributes of the fungal etiology of infection based upon the received test image. Further, the interface module is configured to, but not limited to, display the highest probability infection model, based upon the received plurality of clinical input parameters and image of the lesion related to the patient, and display confirmation of the presence of fungal etiology of infection.
In accordance with an embodiment of the invention, the system comprises a data-management module configured to, but not limited to, store the plurality of clinical input parameters, the images of the lesion and the microscopic image of the test of fungal etiology related to the patient in the storage device.
In accordance with an embodiment of the invention, the system further comprises a reporting module configured to display the test result to the patient.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits, and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Figure 1 illustrates an exemplary environment diagram to which various embodiments of the present invention may be implemented;
Figure 2 illustrates a method for identification of specific fungal infection, in accordance with an embodiment of the present invention;
Figure 3 illustrates a flow diagram for identification of specific fungal infection using Artificial Intelligence, in accordance with an embodiment of the present invention; and
Figure 4 illustrates a system for identification of fungal infection, in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION OF THE ACCOMPANYING DRAWINGS
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, which are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed; on the contrary, the invention is to cover all modifications, equivalents and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to), rather than the mandatory sense (i.e. meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention. It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded by the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawings correspond to like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, the embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only, and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary, and are not intended to limit the scope of the invention.
Referring to the drawings, the invention will now be described in more detail. Figure 1 illustrates an exemplary environment diagram in which various embodiments of the present invention may be implemented. As shown in Figure 1, the environment (100) comprises, but is not limited to, a user device (102) connected to a server (108) via a network (104). In accordance with various embodiments, the user device (102) is selected from a group consisting of, but not limited to, a mobile handheld device, personal computer, desktop, laptop, tablet, etc. Further, in accordance with various embodiments, the network (104) is a Local Area Network (LAN) or a Wide Area Network (WAN). Preferably, the network (104) is the Internet. The server (108) is connected to the network (104) by any suitable means, such as, for example, hardwired and/or wireless connections such as dial-up, cable, Digital Subscriber Line (DSL), satellite, cellular, Personal Communications Service (PCS) or wireless transmission. In accordance with an embodiment of the present invention, the server (108) is a cloud-based server.
In one embodiment of the invention, the server (108) is a web server and/or an Application Programming Interface (API) server. The web server is capable of receiving the input parameters of the user. Further, the server (108) comprises a storage device (110). The server (108) is configured to receive the complete details of the patient, which are further stored in the storage device (110). In accordance with an embodiment, the storage device (110) is a cloud-based storage device. In accordance with another embodiment, the storage device (110) is a local storage device or a dedicated web-based storage device. Further, the environment (100) includes a database (106). The database (106) is connected to the server (108). The database (106) is configured to store the predetermined clinical input parameters of multiple patients into the storage device (110).
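By way of a non-limiting illustration only, the server-side intake described above might be realized as in the following minimal Python sketch. The Flask framework, the endpoint path, the field names and the in-memory store are assumptions made for illustration and are not part of the specification.

```python
# Illustrative sketch only: a server endpoint receiving the clinical input
# parameters and the lesion photograph. Flask, the endpoint path and the
# field names are assumptions, not the claimed implementation.
from flask import Flask, request, jsonify

app = Flask(__name__)
CASES = {}  # stand-in for the storage device (110) / database (106)

@app.route("/cases", methods=["POST"])
def receive_case():
    # Clinical questions and symptoms arrive as form fields,
    # the lesion photograph as an uploaded file.
    params = request.form.to_dict()
    lesion = request.files.get("lesion_image")
    case_id = str(len(CASES) + 1)
    CASES[case_id] = {"params": params,
                      "lesion_image": lesion.read() if lesion else None}
    return jsonify({"case_id": case_id}), 201

if __name__ == "__main__":
    app.run()
```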
Further, the user device (102) contains elements such as, but not limited to, a screen (1022) and an input device (1024). The screen (1022) is one of, but not limited to, a liquid crystal display (LCD) screen, a thin-film transistor (TFT) screen, a light-emitting diode (LED) screen, an organic light-emitting diode (OLED) screen and an active-matrix organic light-emitting diode (AMOLED) screen. In accordance with one embodiment of the present invention, the screen (1022) is a capacitive and/or resistive touch-sensitive screen. Further, the input device (1024) is one of, but not limited to, a camera, keyboard, image scanner, joystick, microphone, pointing device, light pen, touch screen, touch pad and mouse.
Figure 2 illustrates a method (200) for identification of specific fungal infection using Artificial Intelligence, in accordance with an embodiment of the present invention. The method begins at step 210 by receiving a plurality of clinical input parameters and an image of a lesion related to a patient, where the plurality of clinical input parameters related to the patient are fed to the user device by a user. Further, the plurality of clinical input parameters related to the patient is selected from a group consisting of, but not limited to, a plurality of clinical questions and a plurality of clinical symptoms. Further, the plurality of clinical questions is selected from a group consisting of, but not limited to, gender, income, living environment, clinical history, site of infection, pain, bleeding and duration of infection. The gender can be further classified into male and female. The income can be further classified into low income, medium income and high income. The living environment can be further classified into humid, cold, hot and moderate. The clinical history can be further classified into HIV, Hepatitis, weak immunity, TB, cancer, chemotherapy, diabetes, broad spectrum antibiotic use, dentures, tobacco, cigarette, alcohol use and pregnancy (one or more). The site of infection can be further classified into mouth, throat, scalp, hand, breast, nail, foot, groin, beard, lips, back, leg and full body. The choice of pain is categorised as yes or no: if the patient is experiencing pain, the user selects yes; otherwise, no. The choice of bleeding is categorised as yes or no: if the patient is experiencing bleeding, the user selects yes; otherwise, no. The duration of infection can be further classified into less than two weeks and greater than two weeks. Further, the plurality of clinical symptoms is selected from a group consisting of, but not limited to, burning, itching, vaginal discharge, rash, chills, changes in nails, irritation, swelling, discoloured patches, skin scraping, increased sweating, thick and lumpy discharge under the foreskin, fever, redness, contagious skin and annular node. In accordance with the embodiments of the present invention, the plurality of clinical input parameters related to the patient includes a photographic image of the lesion.
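By way of illustration only, the clinical questions and symptoms described above could be encoded as a simple data structure before being processed. The following minimal Python sketch uses field names and category values drawn from the description; the schema itself is an assumption, not a requirement of the invention.

```python
# Illustrative encoding of the clinical input parameters; field names and
# category values are assumptions drawn from the description.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ClinicalInputs:
    gender: str                  # "male" / "female"
    income: str                  # "low" / "medium" / "high"
    living_environment: str      # "humid" / "cold" / "hot" / "moderate"
    clinical_history: List[str]  # e.g. ["diabetes", "broad spectrum antibiotic use"]
    site_of_infection: str       # e.g. "mouth", "groin", "scalp"
    pain: bool
    bleeding: bool
    duration_over_two_weeks: bool
    symptoms: List[str] = field(default_factory=list)  # e.g. ["itching", "rash"]

# Example instance for a hypothetical patient.
example = ClinicalInputs(
    gender="female", income="medium", living_environment="humid",
    clinical_history=["pregnancy"], site_of_infection="groin",
    pain=False, bleeding=False, duration_over_two_weeks=True,
    symptoms=["itching", "vaginal discharge"],
)
```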
In accordance with the embodiments of the present invention, the predetermined clinical input parameters and clinical images of lesions are stored in the database (106). The predetermined clinical input parameters and clinical images are characterized into a plurality of probability models, such that the probability models represent the various types of fungal infections related to the patient. Further, the clinical input parameters stored in the database are retrieved and compared with the received plurality of clinical input parameters related to the patient. This comparison yields one or more higher probability models from the stored plurality of probability models. The received clinical image is then compared, via artificial intelligence, with the predetermined images of lesions to pick the highest probability model. The highest probability model represents the specific type of fungal infection having the same characteristics as present in the clinical input parameters and clinical image related to the patient.
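The following is a minimal, non-authoritative Python sketch of the two-stage selection just described: stored infection profiles are scored against the received parameters to shortlist candidate models, and a lesion-image classifier (represented here by a hypothetical callable) then picks the highest probability model from the shortlist. The profile contents, scoring rule and classifier interface are assumptions for illustration.

```python
# Sketch of the two-stage selection: parameter matching to shortlist models,
# then image-based selection of the highest probability model.
def shortlist_models(inputs: dict, stored_profiles: dict, top_k: int = 3):
    scores = {}
    for name, profile in stored_profiles.items():
        # Count how many stored characteristics match the received parameters.
        scores[name] = sum(1 for k, v in profile.items() if inputs.get(k) == v)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]

def select_highest_probability_model(lesion_image, shortlist, image_classifier):
    # image_classifier is assumed to return {model_name: probability} for the
    # lesion photograph; only shortlisted models are considered.
    probs = image_classifier(lesion_image)
    return max(shortlist, key=lambda name: probs.get(name, 0.0))
```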
At step 220, one or more higher probability models are determined based upon the received plurality of clinical input parameters. At step 225, the highest probability model is selected from the shortlisted models by matching the clinical image of the lesion via artificial intelligence. Further, a test is performed to confirm the fungal etiology of infection. An exemplary procedure of the test, assuming the patient has an infection in the groin region, is provided below:
A sample of scraped skin is taken and put on a slide. Further, a 10% KOH or 20% KOH solution is put on the slide, based on the density of the sample, and is left for 1-2 minutes. The slide is slightly heated, or left for 5-10 minutes, to dry the sample. Further, the slide is set on the microscope and is properly focussed at 10x or 40x magnification. Further, an image of the focussed slide is captured using an Android phone camera. Further, the image is uploaded to the server.
At step 230, a microscopic image of the test performed for confirmation of the fungal etiology of infection is received by the server in the form of a photographic image captured using the user device (102), and is compared via artificial intelligence against stored models for the presence of fungal infection. At step 240, a plurality of attributes of the fungal infection are determined from a group consisting of, but not limited to, budding, hyphae and spores. At step 250, the highest probability model is displayed to the user upon confirmation of the fungal etiology of infection.
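Purely as an illustration of step 240, the attribute detection from the KOH-mount photograph could be framed as a multi-label image classification problem. The following Python sketch assumes PyTorch and a ResNet-18 backbone; the architecture, the (untrained placeholder) weights and the threshold are assumptions, not the model of the invention.

```python
# Sketch of multi-label detection of fungal attributes in a KOH-mount image.
# The network and threshold are illustrative assumptions only.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

ATTRIBUTES = ["budding", "hyphae", "spores"]

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = models.resnet18(weights=None)                       # untrained placeholder
model.fc = nn.Linear(model.fc.in_features, len(ATTRIBUTES)) # multi-label head
model.eval()

def detect_attributes(image_path: str, threshold: float = 0.5):
    image = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.sigmoid(model(image)).squeeze(0)
    # Return only the attributes whose predicted probability exceeds the threshold.
    return {a: float(p) for a, p in zip(ATTRIBUTES, probs) if p >= threshold}
```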
Figure 3 illustrates the flow diagram for identification of specific fungal infection using Artificial Intelligence (300), in accordance with an embodiment of the present invention. As shown in Figure 3, clinical questions (301), clinical symptoms (302) and a clinical image of the lesion (303) are collected, respectively. At step 304, the clinical questions (301), clinical symptoms (302) and clinical image (303) of the site of infection are processed against one or more Artificial Intelligence (AI) models that run on the basis of given human instructions and return prediction results. At step 304, assuming the site of infection is the mouth and the history details include one of, but not limited to, HIV/AIDS, cancer treatment, organ transplantation, diabetes, corticosteroid use, dentures, broad spectrum antibiotic use and tobacco/cigarette/alcohol use, then at step 307 a high probability of white patches or plaque on the tongue and other oral mucous membranes is detected with the use of Artificial Intelligence. Also, redness or soreness in the affected area is detected. At step 305, assuming the gender is female, the site of infection is groin/genital and the history details include one of, but not limited to, pregnancy, diabetes, broad spectrum antibiotic use and corticosteroid use, then at step 308 a high probability of rashes or vaginal discharge is detected with the use of Artificial Intelligence. Similar steps are followed to determine the likely match with each one of the different types of fungal infections supported by the invention. At step 306, assuming the clinical parameters and symptoms match none of the predetermined infection types, or Artificial Intelligence cannot match the clinical image of the lesion, the fungal infection is undetermined.
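As a non-limiting sketch of steps 304-306, the two example branches described above can be expressed as simple rules over the site of infection and the history details. The rule sets and candidate labels below are taken from the two examples in the description; everything else about the sketch is an assumption.

```python
# Illustrative rule sketch for the branches of Figure 3 (steps 304-306).
ORAL_RISKS = {"HIV/AIDS", "cancer treatment", "organ transplantation",
              "diabetes", "corticosteroid use", "dentures",
              "broad spectrum antibiotic use", "tobacco/cigarette/alcohol"}
GENITAL_RISKS = {"pregnancy", "diabetes", "broad spectrum antibiotic use",
                 "corticosteroid use"}

def candidate_infection(inputs: dict) -> str:
    history = set(inputs.get("clinical_history", []))
    site = inputs.get("site_of_infection")
    if site == "mouth" and history & ORAL_RISKS:
        return "oropharyngeal candidiasis (thrush)"        # step 307
    if (inputs.get("gender") == "female"
            and site in ("groin", "genital")
            and history & GENITAL_RISKS):
        return "vaginal candidiasis"                        # step 308
    return "undetermined"                                   # step 306
```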
Further, referring to Figure 3, a test is run to confirm the fungal etiology of infection. Further, a swab slide with KOH solution and saline is prepared and the swab slide is focussed using a microscope. A picture of the focussed slide is taken and is processed via artificial intelligence against the stored models, and confirmatory results are obtained.
At step 307, assuming that the condition is true, Artificial Intelligence detects, with high probability, hyphae, budding or spores in the microscopic image of the KOH preparation at step 309, which further confirms the presence of thrush. At step 308, assuming that the condition is true, Artificial Intelligence detects, with high probability, hyphae, budding or spores in the microscopic image of the KOH preparation at step 310, which further confirms the presence of vaginal candidiasis. Similarly, for any infection type, if the first condition is true, then Artificial Intelligence is used to confirm the fungal etiology by detecting hyphae, budding or spores in the microscopic image of the KOH preparation, and the confirmed fungal infection is displayed to the user.
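The confirmation logic just described can be sketched minimally as follows, assuming the clinical candidate from the earlier branch and the attribute dictionary produced by the microscopic-image detector; the function and message strings are illustrative only.

```python
# Sketch of the confirmation step: a clinical candidate is reported as
# confirmed only if the KOH-mount image shows at least one fungal attribute.
def confirm_infection(candidate: str, detected_attributes: dict) -> str:
    if candidate == "undetermined":
        return "Fungal infection undetermined"
    if any(a in detected_attributes for a in ("hyphae", "budding", "spores")):
        return f"Confirmed: {candidate}"
    return "Fungal etiology not confirmed"
```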
Further, referring to Figure 3, the steps as explained in the flowchart are used to determine each one of the different types of fungal infections supported by the invention.
Figure 4 illustrates a system (400) for identification of fungal infection using Artificial Intelligence, in accordance with an embodiment of the present invention. As shown in Figure 4, the system (400) comprises a control module (405) and an interface module (410). Further, the control module (405) is configured to receive a plurality of clinical input parameters and a clinical image of a lesion related to a patient, to receive a microscopic image of a test performed for confirmation of the fungal etiology of infection, and to determine a plurality of attributes of the fungal infection based upon the received test image. Further, the interface module (410) is configured to display the highest probability model, based upon the received plurality of clinical input parameters and clinical image related to the patient and upon confirmation of the fungal etiology from the microscopic image of the test.
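One possible, purely illustrative decomposition of the system (400) into the modules named above is sketched below in Python; the class structure and method names are assumptions and do not limit how the control module (405) and interface module (410) may be realized.

```python
# Illustrative decomposition of the system (400); names are assumptions.
class ControlModule:                       # control module (405)
    def receive_clinical_inputs(self, params, lesion_image):
        self.params, self.lesion_image = params, lesion_image

    def receive_microscopic_image(self, image):
        self.microscopic_image = image

    def determine_attributes(self, detector):
        # detector is any callable returning attribute probabilities,
        # e.g. {"hyphae": 0.93, "spores": 0.71}
        return detector(self.microscopic_image)

class InterfaceModule:                     # interface module (410)
    def display_highest_probability_model(self, model_name):
        print(f"Most likely infection: {model_name}")

    def display_confirmation(self, result):
        print(result)
```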
In accordance with the embodiment of the present invention, the plurality of clinical input parameters related to the patient is selected from a group consisting of, but not limited to, a plurality of clinical questions and a plurality of clinical symptoms. Further, the plurality of clinical questions is selected from a group consisting of, but not limited to, gender, income, living environment, clinical history, site of infection, pain, bleeding and duration of infection. Further, the plurality of clinical symptoms is selected from a group consisting of, but not limited to, burning, itching, vaginal discharge, rash, chills, changes in nails, irritation, swelling, discolored patches, skin scraping, increased sweating, thick and lumpy discharge under the foreskin, fever, redness, contagious skin and annular node.
In accordance with the embodiment of the present invention, one or more higher probability models are determined from stored plurality of probability models.
In accordance with the embodiment of the present invention, the received clinical image is matched via artificial intelligence against stored clinical images to select the highest probability model.
In accordance with the embodiment of the present invention, the received microscopic image of the test for confirmation of fungal etiology is matched via artificial intelligence, and a plurality of attributes of the fungal infection are determined from a group consisting of, but not limited to, budding, hyphae and spores.
In accordance with the embodiment of the present invention, as shown in Figure 4, the system (400) comprises a data-management module (415). Further, the data-management module (415) is configured to store the plurality of clinical input parameters, the clinical image of the lesion and the microscopic image of the test for confirming fungal etiology related to the patient in the storage device (110).
In accordance with the embodiment of the present invention, as shown in Figure 4, the system (400) comprises a reporting module (420). Further, the reporting module (420) is configured to display the test result to the patient.
The method (200) and the system (400) described above offer a number of advantages. The present invention provides an easy and efficient manner for the identification of fungal infection. Further, the method (200) for the identification of fungal infection assists the patient in getting checked for the presence of a fungal infection in a cost-effective way. Further, the method (200) helps in rapid detection of fungal infection at the "point-of-care" using Artificial Intelligence.
Further, the method (200) and the system (400) for the identification of fungal infection do not require a pathologist to perform the patient screening. Also, the use of the mobile application is simple and any user can perform the task.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention and the appended claims.
Claims: We claim
1. A method (200) for identification of specific fungal infection using Artificial Intelligence, the method (200) comprising steps of:
receiving (210) a plurality of clinical input parameters and clinical image of lesion related to a patient;
displaying (220) the highest probability model, based upon the received plurality of clinical input parameters and clinical image related to the patient;
receiving (230) a microscopic image of test performed for confirmation of fungal infection;
determining (240) a plurality of attributes of fungal etiology of infection based upon the received test image; and
displaying (250) the confirmed specific fungal infection.
2. The method (200) as claimed in claim 1, wherein the plurality of clinical input parameters related to the patient is selected from a group consisting of, a plurality of clinical questions and a plurality of clinical symptoms.
3. The method (200) as claimed in claim 1, wherein the one or more higher probability models are determined from stored plurality of probability models.
4. The method (200) as claimed in claim 1, wherein the highest probability model is selected from the shortlisted models by matching the clinical image of lesion via artificial intelligence.
5. The method (200) as claimed in claim 1, wherein the microscopic image of fungal etiology is matched via artificial intelligence and plurality of attributes of fungal infection are determined from a group consisting of, budding, hyphae and spores.
6. The method (200) as claimed in claim 2, wherein the plurality of clinical questions is selected from a group consisting of, gender, income, living environment, clinical history, site of infection, pain, bleeding and duration of infection.
7. The method (200) as claimed in claim 2, wherein the plurality of clinical symptoms is selected from a group consisting of, burning, itching, vaginal discharge, rash, chills, changes in nails, irritation, swelling, discolored patches, skin scraping, increased sweating, thick and lumpy discharge under the foreskin, fever, redness, contagious skin and annular node.
8. The method (200) as claimed in claim 1, wherein the plurality of clinical input parameters, clinical image of lesion and microscopic image of test of fungal etiology related to the patient is stored in a storage device (110).
9. The method (200) as claimed in claim 1, wherein the test result is displayed to the patient.
10. A system (400) for identification of fungal infection, the system comprising:
a control module (405); and
an interface module (410);
wherein the control module (405) is configured to:
receive a plurality of clinical input parameters and clinical image of lesion related to a patient;
receive a microscopic image of test performed for confirmation of fungal etiology of infection; and
determine a plurality of attributes of fungal infection based upon the received microscopic image of test;
wherein the interface module (410) is configured to:
display the highest probability model, based upon the received plurality of clinical input parameters and clinical image of lesion related to the patient; and display confirmation of fungal etiology of infection.
11. The system (400) as claimed in claim 10, wherein the plurality of clinical input parameters related to the patient is selected from a group consisting of, a plurality of clinical questions and a plurality of clinical symptoms.
12. The system (400) as claimed in claim 10, wherein one or more higher probability models are selected from a stored plurality of probability models.
13. The system (400) as claimed in claim 10, wherein the highest probability model is selected from the shortlisted models by using artificial intelligence to match the clinical image of lesion of the patient against stored clinical images of lesion.
14. The system (400) as claimed in claim 10, wherein the microscopic image of test of fungal etiology is matched to determine a plurality of attributes of fungal infection from a group consisting of, budding, hyphae and spores.
15. The system (400) as claimed in claim 11, wherein the plurality of clinical questions is selected from a group consisting of, gender, income, living environment, clinical history, site of infection, pain, bleeding and duration of infection.
16. The system (400) as claimed in claim 11, wherein the plurality of clinical symptoms is selected from a group consisting of, burning, itching, vaginal discharge, rash, chills, changes in nails, irritation, swelling, discolored patches, skin scraping, increased sweating, thick and lumpy discharge under the foreskin, fever, redness, contagious skin and annular node.
17. The system (400) as claimed in claim 10 further comprises a data-management module (415) configured to store the plurality of clinical input parameters, clinical image of lesion and microscopic image of test of fungal etiology related to the patient in the storage device (110).
18. The system (400) as claimed in claim 10 further comprises a reporting module (420) configured to display the test result to the patient.
Dated this the 13th day of July 2017.