System And Method For Hybrid Edge Artificial Intelligence

Abstract: The present disclosure relates to a system (100) and a method for hybrid edge artificial intelligence on a clip from an audio video source (110) connected over a network (150). The clip is recorded by the audio video source (110) upon detection of motion in the video captured by the source and is sent to the edge system (105) for processing. The edge system (105) collects and processes artificial intelligence outliers from the audio video source (110). The server system (115) performs rebuilding and delta compilation of the artificial intelligence outliers. The edge artificial intelligence module is updated by the edge LCM module over the network (150) by downloading the latest artificial intelligence model from the server system (115).

Patent Information

Application #: 201941033523
Filing Date: 20 August 2019
Publication Number: 35/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: anand.subhash@conceptbytes.com
Parent Application:
Patent Number: 338305
Legal Status:
Grant Date: 2020-06-10
Renewal Date:

Applicants

Concept Realization and IT Solutions Private Limited
Level 9, 10th Floor, Brigade IRV, Narullahalli, Whitefield, Bangalore, Karnataka - 56066

Inventors

1. Anand Subhash
3E Olive, SFS Cyber Palms, NH Bypass, Karimanal, Thiruvananthapuram, Kerala 695583, India
2. Manoon Valiyaparambil
Valiyaparambil House, Neriamangalam P.O, Ernakulam Dist, Kerala 686693

Specification

Claims:

WE CLAIM:
1. A system (100) for hybrid edge artificial intelligence on an audio video source (110) utilizing a server system (115) over a network (150), the system (100) characterized in that it comprises:
an audio video source (110) for recording a video clip comprising an event of interest;
at least one edge system (105) for collecting and processing artificial intelligence outliers from said audio video source (110), wherein said edge system comprises at least one first processor (120), at least one first memory (125) and at least one first I/O interface (130),
the first processor (120) configured to perform processing of artificial intelligence outliers, wherein said first processor comprises,
an inference validator module (165) for detecting artificial intelligence outliers on the audio video source (110) based on output generated by an edge artificial intelligence module (160), wherein the magnitude of outliers on the audio video clip is based at least on a decision of the inference validator module; and
an outlier management module (170) for uploading collected outliers to the server system (115), wherein a retry mechanism of the outlier management module ensures upload based on conditions of both the network (150) and the server system (115);
at least one server system (115) for rebuilding at least one artificial intelligence model based on the artificial intelligence outliers, wherein said server system (115) comprises at least one second processor (135), at least one second memory (140) and at least one second I/O interface (145),
the second processor (135) configured to perform rebuilding and delta compilation of said artificial intelligence outliers, wherein said second processor (135) comprises,
an outlier receiver module (180) for collecting artificial intelligence outliers from one or more edge systems (105) through the network (150) and subsequently storing the collected outliers in an outlier store module (185);
a delta compiler module (190) for sorting and merging the received outliers in the outlier store module (185) into a data store module (195), wherein the magnitude of sorting and merging of outliers from the outlier store is based at least on a decision of the delta compiler module; and
a model rebuild module (200) for generating an updated artificial intelligence model utilizing data in the data store module (195) and subsequently making the generated artificial intelligence model available for download by one or more edge systems (105) over the network (150) through a model LCM module (205); and
an edge LCM module (175) for updating the edge artificial intelligence module (160) of the edge system (105) over the network (150) by downloading the latest artificial intelligence model from the server system (115).

2. The system (100) as claimed in claim 1, wherein said inference validator module (165) identifies artificial intelligence outliers on the audio video source (110) as valid or invalid outliers based on statistical analysis of the output from the edge artificial intelligence module (160) by utilizing data from field references.

3. The system (100) as claimed in claim 2, wherein said inference validator module (165) enables or disables detection of artificial intelligence outliers on audio video source (110).


4. The system (100) as claimed in claim 1, wherein said delta compiler module (190) discards or sorts outliers received in said outlier store module (185) based on quality and type of data, and merges them to the appropriate data section in the data store module (195).

5. The system (100) as claimed in claim 1, wherein said edge artificial intelligence module (160) is updated by downloading the most recent artificial intelligence model from said server system (115), from said model LCM module (205), utilizing said edge LCM module (175).

6. A method (300, 400) for hybrid edge artificial intelligence on an audio video source (110) utilizing a server system (115) over the network (150), using the system (100) as claimed in claim 1, the method (300, 400) characterized in that it comprises:
detecting artificial intelligence outliers on the audio video source (110), by the inference validator module (165), based on output generated by the edge artificial intelligence module (160), wherein the magnitude of outliers on the audio video clip is based at least on a decision of the inference validator module;
uploading collected outliers to the server system (115), by the outlier management module (170), wherein a retry mechanism of the outlier management module ensures upload based on conditions of both the network (150) and the server system (115);
collecting artificial intelligence outliers from one or more edge systems (105) through the network (150), by utilizing the outlier receiver module (180), and subsequently storing the collected outliers in the outlier store module (185);
sorting and merging the received outliers in the outlier store module (185) into the data store module (195), by utilizing the delta compiler module (190), wherein the magnitude of sorting and merging of outliers from the outlier store is based at least on a decision of the delta compiler module;
generating an updated artificial intelligence model utilizing data in the data store module (195), by utilizing the model rebuild module (200), and subsequently making the generated artificial intelligence model available for download by one or more edge systems (105) over the network (150) through the model LCM module (205); and
updating the edge artificial intelligence module (160) of the edge system (105), by utilizing the edge LCM module (175), over the network (150) by downloading the latest artificial intelligence model from the server system (115).

7. The method (300, 400) as claimed in claim 6, further comprising detecting artificial intelligence outliers on the audio video source (110) as valid or invalid outliers, by the inference validator module (165), wherein the outlier is classified based on statistical analysis of the output from the edge artificial intelligence module (160) by utilizing, but not limited to, data from field references.

8. The method (300, 400) as claimed in claim 6, further comprising an option to enable or disable detection of artificial intelligence outliers on the audio video source (110) by the inference validator module (165).

9. The method (300, 400) as claimed in claim 6, wherein the delta compiler module (190) discards or sorts outliers received in the outlier store module (185) based on quality and type of data, and merges them to the appropriate data section in the data store module (195).

10. The method (300, 400) as claimed in claim 6, further comprising updating the edge artificial intelligence module (160) by downloading the most recent artificial intelligence model from the server system (115), from the model LCM module (205), utilizing the edge LCM module (175).
Description:

FIELD OF THE INVENTION
[0001] The present disclosure relates to hybrid edge artificial intelligence. In particular, the present disclosure relates to updating or improving an artificial intelligence model at the edge, based on outliers found over time.
BACKGROUND OF THE INVENTION
[0002] Video surveillance systems widely use Artificial Intelligence to analyse audio and images in order to recognize humans, vehicles, objects and events. The Artificial Intelligence module either resides at the edge, close to the video surveillance system, or resides at a server connected over the internet, typically a cloud server. When the Artificial Intelligence module is at the edge, right after deployment the module stops receiving updates; that is, it fails to understand new situations, patterns or changes in the environment beyond what was anticipated initially. Further, if the Artificial Intelligence module is at a server connected over the internet, every video/audio frame needs to be sent to the server for analysis, which can lead to unnecessary network load. However, in the latter case, the Artificial Intelligence module has an option to learn and update itself for new situations or patterns. Therefore, it is important to have a hybrid approach that brings the benefit of the always-evolving, server-based artificial intelligence model to the edge.

[0003] Video surveillance systems with server-based Artificial Intelligence, however, require a high throughput of data. As a result, consistent, high-speed internet connectivity is required for sending every video/audio frame to the cloud server. In many locations, such as locations in developing countries or locations far removed from other infrastructure, consistent, high-speed internet connectivity is not available. Therefore, there is a need for a system and a method for updating edge Artificial Intelligence over time in locations where consistent, high-speed internet connectivity is not available.
BRIEF SUMMARY OF THE INVENTION
[004] This summary is provided to introduce a selection of concepts in a simple manner that are further described in the detailed description of the disclosure. This summary is not intended to identify key or essential inventive concepts of the subject matter nor is it intended for determining the scope of the disclosure.

[005] An example of a method for hybrid edge artificial intelligence over a network is disclosed. The method comprises detecting Artificial Intelligence outliers from the analytics done on an audio video source, by an inference validator module. Further, the method comprises transmitting the outliers to a remote server, by an outlier management module and an outlier receiver module. Further, the method comprises rebuilding the Artificial Intelligence model at the remote server, by a delta compiler module and a model rebuild module. Further, the method comprises updating the artificial intelligence module at the edge, by a model Life Cycle Management (LCM) module and an edge LCM module.

[006] An example of a system for hybrid edge artificial intelligence over a network is disclosed. The system comprises detecting Artificial Intelligence outliers from the analytics done on a video source, by an inference validator module. Further, the system comprises transmitting the outliers to a remote server, by an outlier management module and an outlier receiver module. Further, the system comprises rebuilding the Artificial Intelligence model at the remote server, by a delta compiler module and a model rebuild module. Further, the system comprises updating the artificial intelligence module at the edge, by a model LCM module and an edge LCM module.

[007] To clarify advantages and features of the present disclosure further, a more particular description of the disclosure is rendered by reference to specific embodiments thereof, which is illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting of its scope. The disclosure is described and explained with additional specificity and detail with the accompanying figures.

BRIEF DESCRIPTION OF THE FIGURES
[008] The disclosure is described and explained with additional specificity and detail with the accompanying figures in which:

[009] FIG. 1A illustrates an environment of a system for hybrid edge artificial intelligence, in accordance with one embodiment of the present disclosure;

[0010] FIGS. 1B and 1C illustrate functional block diagrams of the system, in accordance with one embodiment of the present disclosure; and

[0011] FIGS. 2A, 2B and 3A show a method for hybrid edge artificial intelligence, in accordance with one embodiment of the present disclosure.

[0012] Further, persons skilled in the art to which this disclosure belongs may appreciate that elements in the figures are illustrated for simplicity and may not necessarily have been drawn to scale. Furthermore, in terms of the construction of the system, one or more components of the system may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the figures with details that are readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION
[00013] For the purpose of promoting an understanding of the principles of the disclosure, reference is now made to the embodiment illustrated in the figures and specific language is used to describe them. It should nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications to the disclosure, and such further applications of the principles of the disclosure as described herein being contemplated as would normally occur to one skilled in the art to which the disclosure relates are deemed to be a part of this disclosure.

[00014] It may be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the disclosure and are not intended to be restrictive thereof.

[00015] The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or a method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, other sub-systems, other elements, other structures, other components, additional devices, additional sub-systems, additional elements, additional structures, or additional components. Appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

[00016] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.

[00017] The present disclosure relates to a system and a method for hybrid edge artificial intelligence on one or more audio video sources, over a network connected to a remote server (or cloud server). More specifically, the system and method relate to identifying edge artificial intelligence outliers and further transmitting the outliers to the remote server. Upon receiving the outliers, the remote system performs delta compilation and subsequently triggers an artificial intelligence model rebuild with updated capability to handle the outliers. Further, the updated artificial intelligence model is made available to the edge, thus ensuring that the edge artificial intelligence remains updated over time in handling outliers in a hybrid manner.

[00018] Embodiments of the present disclosure are described below in detail with reference to the accompanying figures.

[00019] FIG. 1A illustrates a system 100 for hybrid edge artificial intelligence, comprising an edge system 105 for identifying and uploading artificial intelligence outliers from one or more audio video sources 110 to a remote server system 115. The one or more audio video sources 110 include, but are not limited to, a Digital Video Recorder (DVR), a Network Video Recorder (NVR), an Internet Protocol (IP) camera or any other audio video streaming device known in the art. In one example, when the video source is a DVR, an NVR or an IP camera, the audio video source records a video clip of the surroundings, in a digital format, with the help of motion analysis software, upon detection of motion in the video. In another example, the video source may comprise passive infrared sensors to detect motion based on body heat. Based on the output of the passive infrared sensors, the audio video source 110 may start recording the video clip.
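
By way of a non-limiting illustration, motion-triggered clip recording of this kind may be sketched in Python using simple frame differencing, assuming a stream of NumPy frames; the threshold, clip length and function names below are illustrative assumptions only.

    # Illustrative sketch: motion-triggered clip capture via frame differencing.
    # MOTION_THRESHOLD and CLIP_LENGTH are assumed values chosen for illustration.
    from collections import deque
    import numpy as np

    MOTION_THRESHOLD = 12.0   # assumed mean absolute pixel difference that counts as motion
    CLIP_LENGTH = 150         # assumed number of frames recorded per clip

    def detect_motion(prev_frame: np.ndarray, frame: np.ndarray) -> bool:
        """Return True when the mean absolute difference between frames exceeds the threshold."""
        diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
        return float(diff.mean()) > MOTION_THRESHOLD

    def record_clips(frames):
        """Yield fixed-length clips whenever motion is detected in the incoming frame stream."""
        prev, clip, remaining = None, deque(maxlen=CLIP_LENGTH), 0
        for frame in frames:
            if prev is not None and detect_motion(prev, frame):
                remaining = CLIP_LENGTH          # (re)start the recording window on motion
            if remaining > 0:
                clip.append(frame)
                remaining -= 1
                if remaining == 0:
                    yield list(clip)             # completed clip, to be sent to the edge system
                    clip.clear()
            prev = frame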

[00020] When the audio video source 110 records a video clip comprising an event of interest, the video clip is sent to the edge system 105 for further processing. Then, artificial intelligence outlier information is sent to the server system 115 over a network 150.

[00021] The edge system 105 may comprise at least one first processor 120, at least one first memory 125 and at least one first Input/output (I/O) interface 130. The first processor 120 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitry, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one first processor 120 is configured to fetch and execute computer-readable instructions stored in the memory.

[00022] The first memory 125 may include any computer-readable medium known in the art including, for example, volatile memory, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM), and/or non-volatile memory, such as Read Only Memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.

[00023] The first I/O interface 130 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. Further, the first I/O interface 130 may enable the edge system 105 to communicate with other computing devices, such as web servers and external data servers (not shown). The first I/O interface 130 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The first I/O interface 130 may include one or more ports for connecting a number of devices to one another or to a server.

[00024] Although the present disclosure is explained by considering that the edge system 105 is implemented on a server, it may be understood that the edge system 105 may also be implemented in a variety of computing systems, such as a mainframe computer, a mobile device, a single board computer and the like. The edge system 105 communicates with the remote server 115 over a network 150.

[00025] In one implementation, the network 150 may be a wireless network, a wired network or a combination thereof. The network 150 may be implemented as one of the different types of networks, such as intranet, Local Area Network (LAN), wide area network (WAN), the internet, and the like. The network 150 may either be a dedicated network or a shared network. The shared network represents an association of different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 150 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.

[00026] The server system 115 for receiving artificial intelligence outliers from one or more edge systems 105 over the network 150 is shown, in accordance with one embodiment of the present disclosure.

[00027] The edge system 105 collects artificial intelligence outliers and sends them to the server system 115 over the network 150, periodically or when network 150 connectivity is available.

[00028] The server system 115 further processes the artificial intelligence outlier data and generates an updated version of the artificial intelligence model. Further, the updated version of the artificial intelligence model is sent to the edge system 105 over the network 150.

[00029] The server system 115 may comprise at least one second processor 135, a second memory 140 and a second I/O interface 145. The at least one second processor 135 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitry, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one second processor 135 is configured to fetch and execute computer-readable instructions stored in the memory.

[00030] The second memory 140 may include any computer-readable medium known in the art including, for example, volatile memory, such as Static Random Access Memory (SRAM) and Dynamic Random Access Memory (DRAM), and/or non-volatile memory, such as Read Only Memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.

[00031] The second I/O interface 145 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. Further, the second I/O interface 145 may enable the server system 115 to communicate with other computing devices, such as web servers and external data servers (not shown). The second I/O interface 145 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The second I/O interface 145 may include one or more ports for connecting a number of devices to one another or to a server.

[00032] Although the present disclosure is explained by considering that the server system 115 is implemented on a server, it may be understood that the server system 115 may also be implemented in a variety of computing systems, such as a mainframe computer, a mobile device, a cloud server, a single board computer and the like. The server system 115 communicates with one or more edge systems 105 over the network 150.

[00033] Referring to FIG. 1B, functional blocks of the first processor 120 of the edge system 105 are shown, in accordance with one embodiment of the present disclosure. The first processor 120 comprises an edge AI module 160, an inference validator module 165, an outlier management module 170, and an edge LCM module 175. The edge AI module 160 comprises an artificial intelligence model and functionality for performing detections on patterns. The functioning of each of the modules is explained in detail below with reference to FIGS. 2A and 2B, in conjunction with FIGS. 1A and 1B.

[00034] Referring to FIG. 1C, functional blocks of the second processor 135 of the server system 115 are shown, in accordance with one embodiment of the present disclosure. The second processor 135 comprises an outlier receiver module 180, an outlier store module 185, a delta compiler module 190, a data store module 195, a model rebuild module 200 and a model LCM module 205. The functioning of each of the modules is explained in detail below with reference to FIG. 3A, in conjunction with FIGS. 1A and 1C.

[0035] Referring to FIGS. 2A and 2B, in conjunction with FIGS. 1A and 1B, a method 300 for identifying artificial intelligence outliers, uploading them to the server system 115, and further receiving an updated model from the server system 115 is described, in accordance with one embodiment of the present disclosure. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through the network 150. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.

[0036] Referring to FIG. 3A, in conjunction with FIGS. 1A and 1C, a method 400 for generating an updated artificial intelligence model from outliers received from the edge system 105, and further making the updated model available for download by the edge system 105, is described, in accordance with one embodiment of the present disclosure. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 400 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through the network 150. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.

[0037] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the scope of the disclosure described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be implemented in the above-described edge system 105.

[0038] The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 400 or alternate methods. Additionally, individual blocks may be deleted from the method 400 without departing from the scope of the disclosure described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 400 may be implemented in the above-described server system 115.

[00039] Referring to FIG. 2A, at step 305, the edge AI module 160 receives a clip from the audio video source 110.

[00040] At step 310, the edge AI module 160 performs pattern recognition on the clip received from the audio video source 110. In one example, pattern recognition involves classification of objects in the video clip with attributes such as object type, relative position and confidence level. In one example, pattern recognition involves classification of audio with attributes such as decibel level, stress level, loudness, gender, pitch and so on.
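
By way of a non-limiting illustration, the pattern-recognition attributes listed above may be represented with data structures along the following lines; the field names are illustrative assumptions.

    # Illustrative sketch: possible structures for the pattern-recognition output
    # attributes mentioned above. Field names are assumptions.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class VideoDetection:
        object_type: str                        # e.g. "person", "vehicle"
        relative_position: Tuple[float, float]  # position of the object within the frame
        confidence: float                       # detection confidence in [0.0, 1.0]

    @dataclass
    class AudioDetection:
        decibel_level: float
        loudness: float
        pitch: float
        stress_level: Optional[float] = None
        gender: Optional[str] = None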

[00041] At step 315, the status of the hybrid edge artificial intelligence feature, enabled or disabled, is verified before further processing. If the feature is disabled, further processing is skipped and the hybrid mode does not function. If the feature is enabled, further processing of the output from the edge AI module 160 is performed by the inference validator module 165.

[00042] At step 320, the inference validator module 165 determines whether outliers are found in the pattern recognition. Outlier detection is done through various methods. In one example, outliers are detected if high variance is observed in the detection confidence. In one example, outliers are detected if a single frame or instant yields a detection whereas the preceding or succeeding frames or instances do not yield detections for the same pattern.

[00043] At step 320, the inference validator module 165 determines whether outliers are found in the pattern recognition, in conjunction with field references (step 325). Static information about the video source 110 is stored in the field references for cross verification of pattern recognitions. In one implementation, the relative position of the video source 110 is utilized to determine a wrong pattern recognition. For example, an indoor video source cannot have a vehicle recognition from the artificial intelligence module. In one implementation, the order of the video sources 110 is utilized to determine a wrong detection. For example, an object or sound cannot appear at an indoor video source 110 if the same has not first occurred at an outdoor or passage video source 110.
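
By way of a non-limiting illustration, the outlier checks of steps 320 and 325 may be sketched in Python as follows; the variance threshold, function names and the field-reference format are illustrative assumptions.

    # Illustrative sketch of the outlier checks described in steps 320 and 325.
    # The threshold value and the field-reference format are assumptions.
    from statistics import pvariance

    CONF_VARIANCE_LIMIT = 0.05   # assumed variance above which detections are flagged

    def high_confidence_variance(confidences: list) -> bool:
        """Flag an outlier when detection confidence varies strongly across frames."""
        return len(confidences) > 1 and pvariance(confidences) > CONF_VARIANCE_LIMIT

    def isolated_detection(per_frame_counts: list, index: int) -> bool:
        """Flag an outlier when one frame has a detection but its neighbours do not."""
        before = per_frame_counts[index - 1] if index > 0 else 0
        after = per_frame_counts[index + 1] if index + 1 < len(per_frame_counts) else 0
        return per_frame_counts[index] > 0 and before == 0 and after == 0

    def violates_field_reference(object_type: str, source_location: str,
                                 field_references: dict) -> bool:
        """Flag an outlier when the detected class is implausible for the source location,
        e.g. a vehicle detected by an indoor camera."""
        allowed = field_references.get(source_location, set())
        return object_type not in allowed

For example, with field_references = {"indoor": {"person"}, "gate": {"person", "vehicle"}}, a vehicle detection reported by an indoor camera would be flagged as an outlier.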

[00044] At step 330, the outlier management module 170 collects all outlier information and forms an outlier package. The outlier package is uploaded to the remote server system 115 over the network 150.

[00045] At step 335, if the network 150 fails or the remote server system 115 does not respond to the outlier upload, repeated attempts are made to retransmit the outliers to the server system 115.
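
By way of a non-limiting illustration, the upload and retry behaviour of steps 330 and 335 may be sketched as an upload with exponential backoff; the endpoint URL, packaging format and retry parameters are illustrative assumptions.

    # Illustrative sketch of the retry mechanism described in steps 330 and 335.
    # The endpoint, payload format and retry parameters are assumptions.
    import json
    import time
    import urllib.request

    def upload_outlier_package(outliers, url, max_attempts=5, base_delay=2.0):
        """Upload the outlier package, retrying while the network or server is unavailable."""
        payload = json.dumps({"outliers": outliers}).encode("utf-8")
        for attempt in range(max_attempts):
            try:
                request = urllib.request.Request(
                    url, data=payload, headers={"Content-Type": "application/json"})
                with urllib.request.urlopen(request, timeout=30) as response:
                    if response.status == 200:
                        return True
            except OSError:
                pass  # network failure or unreachable server; retry after a delay
            time.sleep(base_delay * (2 ** attempt))
        return False  # caller may keep the package on disk and retry later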

[00046] Referring to FIG. 2B, at step 340, the edge LCM module 175 connects with the remote server system 115 and retrieves details of the AI models available at the remote server system 115.

[00047] At step 345, it is verified whether the available artificial intelligence model is newer than the one currently deployed; if so, the edge artificial intelligence module 160 is updated (step 350), else no action is taken.
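
By way of a non-limiting illustration, the version check and update of steps 340 to 350 may be sketched as follows; the endpoint paths and metadata fields are illustrative assumptions.

    # Illustrative sketch of steps 340-350: fetch model metadata from the server and
    # download the model only when it is newer than the currently deployed version.
    # The endpoint paths and metadata fields are assumptions.
    import json
    import urllib.request

    def check_and_update_model(server_url, current_version, model_path):
        """Return True when a newer model was downloaded and written to model_path."""
        with urllib.request.urlopen(f"{server_url}/models/latest", timeout=30) as resp:
            meta = json.load(resp)           # e.g. {"version": 7, "url": ".../model_v7.bin"}
        if meta["version"] <= current_version:
            return False                     # already up to date; no action taken
        with urllib.request.urlopen(meta["url"], timeout=300) as resp:
            data = resp.read()
        with open(model_path, "wb") as f:
            f.write(data)
        return True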

[00048] Referring to FIG. 3A, at step 405, the outlier receiver module 180 receives outliers from the edge system 105 over the network 150. Received outliers are stored in the outlier store module 185.

[00049] At step 410, the delta compiler module 190 takes the collected outliers in the outlier store module 185 and merges them with existing data in the data store module 195. In one implementation, outliers in the outlier store 185 are separated into audio or video and merged accordingly with the respective section of the data store 195. In one implementation, outliers in the outlier store are verified for data quality, and if low quality is found they are discarded. For example, video or image clarity is verified to decide whether to further use or discard a received outlier. For example, the noise level in the audio or video is processed to decide whether to further process or discard a received outlier.
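
By way of a non-limiting illustration, the sort, quality-filter and merge operations of step 410 may be sketched as follows; the quality score, record format and directory layout are illustrative assumptions.

    # Illustrative sketch of step 410: separate received outliers by media type, discard
    # low-quality items, and merge the rest into the corresponding data store section.
    # The quality measure and directory layout are assumptions.
    import json
    import pathlib

    MIN_QUALITY = 0.4  # assumed minimum quality score (e.g. sharpness or signal-to-noise)

    def compile_outliers(outlier_store: pathlib.Path, data_store: pathlib.Path) -> None:
        """Sort outliers into audio/video sections of the data store, dropping low-quality ones."""
        for record_file in sorted(outlier_store.glob("*.json")):
            record = json.loads(record_file.read_text())
            if record.get("quality", 0.0) < MIN_QUALITY:
                record_file.unlink()                      # discard low-quality outlier
                continue
            section = data_store / record["media_type"]   # "audio" or "video"
            section.mkdir(parents=True, exist_ok=True)
            (section / record_file.name).write_text(json.dumps(record))
            record_file.unlink()                          # merged; remove from the outlier store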

[00050] At step 415, the model rebuild module 200 triggers a rebuild of the updated model with the rebuild data from the data store module 195. In one implementation, the model rebuild is done using distributed systems. In another implementation, the model rebuild is done on a single system.

[00051] At step 420, the model LCM module 205 holds the updated artificial intelligence model for further download by the edge LCM module 175.
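
By way of a non-limiting illustration, the model LCM module may be sketched as a simple file-based registry of versioned models; the directory layout and naming scheme are illustrative assumptions.

    # Illustrative sketch of step 420: a minimal file-based registry in which versioned
    # models are held for download by edge systems. Layout and naming are assumptions.
    import pathlib
    import shutil

    def publish_model(model_file: pathlib.Path, registry: pathlib.Path) -> int:
        """Copy a rebuilt model into the registry under the next version number."""
        registry.mkdir(parents=True, exist_ok=True)
        versions = [int(p.stem.split("_v")[-1]) for p in registry.glob("model_v*.bin")]
        next_version = max(versions, default=0) + 1
        shutil.copy(model_file, registry / f"model_v{next_version}.bin")
        return next_version

    def latest_model(registry: pathlib.Path):
        """Return the path of the newest published model, or None when the registry is empty."""
        candidates = sorted(registry.glob("model_v*.bin"),
                            key=lambda p: int(p.stem.split("_v")[-1]))
        return candidates[-1] if candidates else None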

[0052] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.

[0053] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible.

Documents

Application Documents

# Name Date
1 201941033523-FORM FOR STARTUP [20-08-2019(online)].pdf 2019-08-20
2 201941033523-FORM FOR SMALL ENTITY(FORM-28) [20-08-2019(online)].pdf 2019-08-20
3 201941033523-FORM 1 [20-08-2019(online)].pdf 2019-08-20
4 201941033523-FIGURE OF ABSTRACT [20-08-2019(online)].pdf 2019-08-20
5 201941033523-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [20-08-2019(online)].pdf 2019-08-20
6 201941033523-DRAWINGS [20-08-2019(online)].pdf 2019-08-20
7 201941033523-COMPLETE SPECIFICATION [20-08-2019(online)].pdf 2019-08-20
8 201941033523-FORM-9 [21-08-2019(online)].pdf 2019-08-21
9 201941033523-FORM 18A [21-08-2019(online)].pdf 2019-08-21
10 201941033523-Proof of Right (MANDATORY) [28-08-2019(online)].pdf 2019-08-28
11 201941033523-FORM-26 [28-08-2019(online)].pdf 2019-08-28
12 201941033523-FORM 3 [28-08-2019(online)].pdf 2019-08-28
13 201941033523-ENDORSEMENT BY INVENTORS [28-08-2019(online)].pdf 2019-08-28
14 Correspondence by Agent_Startup_05-09-2019.pdf 2019-09-05
15 Correspondence by Agent_Form1,Form5_05-09-2019.pdf 2019-09-05
16 201941033523-FER.pdf 2019-10-15
17 201941033523-OTHERS [05-12-2019(online)].pdf 2019-12-05
18 201941033523-FER_SER_REPLY [05-12-2019(online)].pdf 2019-12-05
19 201941033523-COMPLETE SPECIFICATION [05-12-2019(online)].pdf 2019-12-05
20 201941033523-CLAIMS [05-12-2019(online)].pdf 2019-12-05
21 201941033523-HearingNoticeLetter-(DateOfHearing-21-02-2020).pdf 2020-02-07
22 201941033523-FORM 4(ii) [17-02-2020(online)].pdf 2020-02-17
23 201941033523-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [19-02-2020(online)].pdf 2020-02-19
24 201941033523-Response to office action [21-02-2020(online)].pdf 2020-02-21
25 201941033523-ExtendedHearingNoticeLetter-(DateOfHearing-19-03-2020).pdf 2020-02-25
26 201941033523-AMENDED DOCUMENTS [29-03-2020(online)].pdf 2020-03-29
27 201941033523-Annexure [29-03-2020(online)].pdf 2020-03-29
28 201941033523-FORM 13 [29-03-2020(online)].pdf 2020-03-29
29 201941033523-FORM 3 [29-03-2020(online)].pdf 2020-03-29
30 201941033523-MARKED COPIES OF AMENDEMENTS [29-03-2020(online)].pdf 2020-03-29
31 201941033523-Written submissions and relevant documents [29-03-2020(online)].pdf 2020-03-29
32 201941033523-Abstract_Granted 338305_10-06-2020.pdf 2020-06-10
33 201941033523-Claims_Granted 338305_10-06-2020.pdf 2020-06-10
34 201941033523-Description_Granted 338305_10-06-2020.pdf 2020-06-10
35 201941033523-Drawings_Granted 338305_10-06-2020.pdf 2020-06-10
36 201941033523-Marked up Claims_Granted 338305_10-06-2020.pdf 2020-06-10
37 201941033523-IntimationOfGrant10-06-2020.pdf 2020-06-10
38 201941033523-PatentCertificate10-06-2020.pdf 2020-06-10
39 201941033523-AMENDED DOCUMENTS [23-06-2020(online)].pdf 2020-06-23
40 201941033523-FORM 13 [23-06-2020(online)].pdf 2020-06-23
41 201941033523-FORM FOR STARTUP [23-06-2020(online)].pdf 2020-06-23
42 201941033523-OTHERS [23-06-2020(online)].pdf 2020-06-23
43 201941033523-RELEVANT DOCUMENTS [23-06-2020(online)].pdf 2020-06-23
44 201941033523-Correspondence_Startup_03-07-2020.pdf 2020-07-03
45 201941033523-RELEVANT DOCUMENTS [15-11-2020(online)].pdf 2020-11-15
46 201941033523-RELEVANT DOCUMENTS [20-08-2022(online)].pdf 2022-08-20
47 201941033523-RELEVANT DOCUMENTS [23-08-2022(online)].pdf 2022-08-23
48 201941033523-FORM-27 [11-09-2024(online)].pdf 2024-09-11
49 201941033523-FORM FOR SMALL ENTITY [01-04-2025(online)].pdf 2025-04-01
50 201941033523-EVIDENCE FOR REGISTRATION UNDER SSI [01-04-2025(online)].pdf 2025-04-01

Search Strategy

1 201941033523_03-10-2019.pdf

ERegister / Renewals

3rd: 26 Jul 2020

From 20/08/2021 - To 20/08/2022

4th: 26 Jul 2020

From 20/08/2022 - To 20/08/2023

5th: 26 Jul 2020

From 20/08/2023 - To 20/08/2024

6th: 26 Jul 2020

From 20/08/2024 - To 20/08/2025

7th: 02 Apr 2025

From 20/08/2025 - To 20/08/2026

8th: 02 Apr 2025

From 20/08/2026 - To 20/08/2027

9th: 02 Apr 2025

From 20/08/2027 - To 20/08/2028

10th: 02 Apr 2025

From 20/08/2028 - To 20/08/2029

11th: 02 Apr 2025

From 20/08/2029 - To 20/08/2030