A Content Annotation System And Method Thereof

Abstract: The present invention relates to a content annotation system. The system provides output video quality equal to the quality of the input video received from one or more sources. The system (100) includes an input module (106) that receives at least one input stream from one or more sources. An amplifier (108) amplifies the input stream. A synchronization separator (110) extracts synchronization signals from the amplified stream and generates extracted data. A processing unit (112) processes the extracted data and determines a digital symbol. A first converter (114) converts the digital symbol to an analog symbol. A second converter (116) converts the analog symbol to a composite symbol. A multiplexer (118) overlays the composite symbol over the amplified stream and generates an analog output video.

Patent Information

Application #
201941011843
Filing Date
26 March 2019
Publication Number
40/2020
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
info@krishnaandsaurastri.com
Parent Application
Patent Number
Legal Status
Grant Date
2024-11-14
Renewal Date

Applicants

Bharat Electronics Limited
Outer Ring Road, Nagavara, Bangalore, Karnataka - 560045, India.

Inventors

1. Ankit Sohni
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O, Bangalore, Karnataka - 560013.
2. Vijay Baragur
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O. Bangalore, Karnataka - 560013.
3. Virendra Kumar Mittal
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O. Bangalore, Karnataka - 560013.
4. Santosh Kumar
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O. Bangalore, Karnataka - 560013.
5. Chaveli Ramesh
Central Research Laboratory, Bharat Electronics Limited, Jalahalli P.O. Bangalore, Karnataka - 560013.

Specification

TECHNICAL FIELD
The present invention relates generally to processing systems. More particularly, the present invention relates to a content annotation system.

BACKGROUND
Digitally stored symbols can be overlaid on an analog video without requiring the analog video to be digitized. Further, digitization of the analog video by a video analog-to-digital converter (ADC) can introduce distortion and quantization noise into the input video.

US5541666A, titled “Method & Apparatus for overlaying digitally generated graphics over an analog video signal”, discloses overlaying digital character signals on an analog video source signal, which requires a predetermined color subcarrier. The apparatus includes a subcarrier phase-locked loop, a digital character generating device, a digital video encoder and a switching device. The digital character generating device detects horizontal and vertical timing of pixel information in the analog video source signal. The digital video encoder is responsive to the color subcarrier and system clock signals for generating a separate color subcarrier which is locked to the color subcarrier of the analog video source signal. This method requires a color subcarrier and a phase-locked loop to realize the system, which increases the complexity of the system.

US5583536A, titled “Method & Apparatus for analog video merging and key detection”, discloses annotating video content with metadata generated using speech recognition technology, which requires a summing circuit for summing an overlay signal and a default signal to generate a composite signal. The video mixer includes a comparator that has a first input for receiving the composite signal and a second input for receiving the overlay signal. The comparator compares the signal levels available at the first input and the second input. The comparator output is used to generate an enable signal in the presence of the default key color signal.
US5696527A, titled “Multimedia overlay system for Graphics and Video”, discloses a system in which an RGB (Red-Green-Blue) Video Graphics Array (VGA) signal is applied at one input of a multiplexer (MUX). The digital video data stored in a video memory buffer on a video card is converted to RGB analog signals using a digital-to-analog converter, and those analog signals are applied to a second input of the analog multiplexer. Specifically, US5696527A is based on color key detection using a comparator. When the comparator detects a color key, the analog video is passed through the multiplexer for display; otherwise, the VGA video signal is displayed. Hence, this method relies on color key detection, which requires an analog comparator, thereby increasing the complexity and the number of components in the system.

Hence, there is a need for a content annotation system which solves the above-defined problems, removes the need for a video analog-to-digital converter, and provides output video quality equal to the quality of the input video received from one or more sources.

SUMMARY
This summary is provided to introduce concepts related to a content annotation system and method thereof. This summary is neither intended to identify essential features of the present invention nor is it intended for use in determining or limiting the scope of the present invention.

Various embodiments herein provide one or more content annotation systems and methods thereof. In one of the embodiments, a method for annotating content includes a step of receiving, by an input module, at least one input stream from one or more sources. The method includes a step of amplifying, by an amplifier, the input stream. The method includes a step of extracting, by a synchronization separator, synchronization signals from the amplified stream, and generating extracted data. The method includes a step of processing, by a processing unit, the extracted data and determining at least one digital symbol. The method includes a step of converting, by a first converter, the digital symbol to an analog symbol. The method includes a step of converting, by a second converter, the analog symbol to a composite symbol. The method includes a step of overlaying, by a multiplexer, the composite symbol over the amplified stream. The method includes a step of generating, by the multiplexer, an analog output video based on the overlaid symbol.

In another embodiment, a content annotation system includes a memory, a processor, an input module, an amplifier, a synchronization separator, a processing unit, a first converter, a second converter, and a multiplexer. The memory is configured to store pre-defined rules. The processor is configured to generate system processing commands based on the pre-defined rules. The input module is configured to receive at least one input stream from one or more sources. The amplifier is configured to amplify the input stream. The synchronization separator is configured to extract synchronization signals from the amplified stream, and generate extracted data. The processing unit is configured to process the extracted data and determine at least one digital symbol. The first converter is configured to convert the digital symbol to an analog symbol. The second converter is configured to convert the analog symbol to a composite symbol. The multiplexer is configured to overlay the composite symbol over the amplified stream, and generate an analog output video.

BRIEF DESCRIPTION OF ACCOMPANYING DRAWINGS
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and modules.

Figure 1 illustrates a block diagram depicting a content annotation system, according to an exemplary implementation of the present invention.

Figure 2 illustrates a schematic diagram depicting a workflow of the content annotation system of Figure 1, according to an exemplary implementation of the present invention.

Figure 3 illustrates a flowchart depicting a method for annotating content, according to an exemplary implementation of the present invention.

It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present invention. Similarly, it will be appreciated that any flowcharts, flow diagrams, and the like represent various processes which may be substantially represented in computer readable medium and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

DETAILED DESCRIPTION
In the following description, for the purpose of explanation, specific details are set forth in order to provide an understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without these details. One skilled in the art will recognize that embodiments of the present invention, some of which are described below, may be incorporated into a number of systems.

The various embodiments of the present invention provide a content annotation system and method thereof.

Furthermore, connections between components and/or modules within the figures are not intended to be limited to direct connections. Rather, these components and modules may be modified, re-formatted or otherwise changed by intermediary components and modules.

References in the present invention to “one embodiment” or “an embodiment” mean that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.

In one of the embodiments, a method for annotating content includes a step of receiving, by an input module, at least one input stream from one or more sources. The method includes a step of amplifying, by an amplifier, the input stream. The method includes a step of extracting, by a synchronization separator, synchronization signals from the amplified stream, and generating extracted data. The method includes a step of processing, by a processing unit, the extracted data and determining at least one digital symbol. The method includes a step of converting, by a first converter, the digital symbol to an analog symbol. The method includes a step of converting, by a second converter, the analog symbol to a composite symbol. The method includes a step of overlaying, by a multiplexer, the composite symbol over the amplified stream. The method includes a step of generating, by the multiplexer, an analog output video based on the overlaid symbol.

In another implementation, the method includes storing, in a database, a plurality of digital symbols.

In another implementation, processing the extracted data includes estimating synchronization duration and pixel position of the digital symbol for determining the digital symbol.

In another implementation, estimating the pixel position using the synchronization signals includes counting a number of clocks in the synchronization duration.
In another implementation, the method includes converting, by the first converter, the digital symbol to analog RGB (Red, Green, Blue) symbol.

In another implementation, the method includes generating, by the multiplexer, a control command for generating the analog output video using the analog symbol and the amplified stream.

In another embodiment, a content annotation system includes a memory, a processor, an input module, an amplifier, a synchronization separator, a processing unit, a first converter, a second converter, and a multiplexer. The memory is configured to store pre-defined rules. The processor is configured to generate system processing commands based on the pre-defined rules. The input module is configured to receive at least one input stream from one or more sources. The amplifier is configured to amplify the input stream. The synchronization separator is configured to extract synchronization signals from the amplified stream, and generate extracted data. The processing unit is configured to process the extracted data and determine at least one digital symbol. The first converter is configured to convert the digital symbol to an analog symbol. The second converter is configured to convert the analog symbol to a composite symbol. The multiplexer is configured to overlay the composite symbol over the amplified stream, and generate an analog output video.

In another implementation, the system includes a control signal generator. The control signal generator is configured to control the multiplexer using the pre-defined rules.

In another implementation, the system includes a database configured to store a plurality of digital symbols.

In another implementation, the processing unit is configured to estimate synchronization duration and pixel position of the digital symbol for determining the digital symbol.

In another implementation, the processing unit is configured to estimate the pixel position using the synchronization signals by counting a number of clocks in the synchronization duration.

In another implementation, the first convertor is a digital to analog converter (DAC), configured to convert the digital symbol to analog RGB (Red, Green, Blue) symbol.

In another implementation, the second converter is a composite video converter.

In another implementation, the multiplexer is configured to generate a control command and generate the analog output video using the analog symbol and the amplified stream.

In an embodiment, the content annotation system provides a digital symbol overlay over an analog video, using a processing unit for video synchronization duration estimation and pixel position estimation with synchronization as a reference. The content annotation system helps to preserve output video quality equal to the input video quality received from a source (for example, a camera), while video symbols are overlaid on the analog video. In one embodiment, the digitally stored symbols are overlaid on the analog video without requiring the analog video to be digitized. Digitization of analog video by a video ADC (analog-to-digital converter) introduces distortion and quantization noise into the input video; the content annotation system removes the need for the video ADC, which could otherwise introduce quantization noise into the output video.
In an embodiment, the content annotation system provides the digital symbol overlay over the analog video signal in the analog domain. A digital symbol overlay could instead be performed in the digital domain, but this would introduce distortion in the video due to quantization noise from the video ADC; the quality of the video is degraded when the video is digitized. To avoid this quantization noise, an analog symbol overlay is provided. Here, the pixel position is detected with the help of the video synchronization and a system clock, by counting the number of clock intervals during the entire synchronization duration. The digital symbols are stored in digital format and are overlaid, or annotated, over the analog video to maintain the quality of the analog video.
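The clock-counting approach to pixel-position estimation can be sketched in a few lines of Python. This is an illustrative simulation only; the 27 MHz system clock, the 64 µs synchronization-to-synchronization duration, the 720 active pixels per line and the function names are assumed example values, not figures from the specification.

```python
# Illustrative sketch of pixel-position estimation by clock counting.
# Assumptions (not from the specification): a 27 MHz system clock and a
# 64 us horizontal sync-to-sync duration, with 720 displayable pixels per line.

SYSTEM_CLOCK_HZ = 27_000_000      # assumed system clock
LINE_PERIOD_S = 64e-6             # assumed synchronization duration (S_d)
ACTIVE_PIXELS_PER_LINE = 720      # assumed pixels per line

# Number of system clocks counted over one synchronization duration (equation 1):
clocks_per_line = round(LINE_PERIOD_S * SYSTEM_CLOCK_HZ)     # N
clock_width_s = LINE_PERIOD_S / clocks_per_line               # S_d / N

def pixel_position(clocks_since_hsync: int) -> int:
    """Map the running clock count since the last horizontal sync pulse
    to an approximate pixel position on the current line."""
    fraction_of_line = clocks_since_hsync / clocks_per_line
    return int(fraction_of_line * ACTIVE_PIXELS_PER_LINE)

if __name__ == "__main__":
    print(f"clocks per line (N): {clocks_per_line}")
    print(f"clock width (S_d / N): {clock_width_s * 1e9:.1f} ns")
    print(f"pixel at mid-line: {pixel_position(clocks_per_line // 2)}")
```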

Figure 1 illustrates a block diagram depicting a content annotation system, according to an exemplary implementation of the present invention.

A content annotation system (hereinafter referred to as “system”) (100) includes a memory (102), a processor (104), an input module (106), an amplifier (108), a synchronization separator (110), a processing unit (112), a first converter (114), a second converter (116), and a multiplexer (118).

The memory (102) is configured to store pre-determined rules related to annotating content of video, controlling modules, conversion, overlaying data, and amplifying data. The memory (102) is also configured to store network related data. In an embodiment, the memory (102) can include any computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory (102) also includes a cache memory to work with the system (100) more effectively.

The processor (104) is configured to cooperate with the memory (102) to receive the pre-determined rules. The processor (104) is further configured to generate system processing commands. In an embodiment, the processor (104) may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor (104) is configured to fetch the pre-determined rules from the memory (102) and execute different modules of the system (100).

The input module (106) is configured to receive at least one input stream from one or more sources. In an embodiment, the input stream can be video stream, audio stream, and combinations thereof. In another embodiment, the one or more sources include a camera, a recorder, and other similar devices. In one embodiment, the input module (106) is configured to receive analog input video stream from a source.

The amplifier (108) is configured to cooperate with the input module (106) to receive the input stream. The amplifier (108) is configured to amplify the input stream. In an embodiment, the amplifier (108) is configured to amplify the input stream and divide it into two parts, i.e. a first part transmitted to the synchronization separator (110) and a second part transmitted to the multiplexer (118). In one embodiment, the amplifier (108) is configured to amplify the input video stream in order to pass the amplified video to the synchronization separator (110) and the multiplexer (118) and avoid distortion.

The synchronization separator (110) is configured to cooperate with the amplifier (108) to receive the amplified stream. The synchronization separator (110) is configured to extract synchronization signals from the amplified stream, and generate extracted data. In an embodiment, the extracted data includes horizontal and vertical synchronization data. In one embodiment, the synchronization separator (110) is configured to extract video synchronization signals for determining the pixel position for video overlaying.
The processing unit (112) is configured to cooperate with the synchronization separator (110) to receive the extracted data. The processing unit (112) is configured to process the extracted data and determine at least one digital symbol. In an embodiment, the processing unit (112) is configured to estimate synchronization duration and pixel position of the digital symbol for determining the digital symbol. In another embodiment, the processing unit (112) is configured to estimate the pixel position using the synchronization signals by counting a number of clocks in the synchronization duration.

The first converter (114) is configured to cooperate with the processing unit (112) to receive the determined digital symbol. The first converter (114) is configured to convert the digital symbol into an analog symbol. In an embodiment, the first converter (114) is a digital to analog converter (DAC), which is configured to convert the digital symbol to analog RGB (Red, Green, Blue) symbol.
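As a rough illustration of this first-converter step, the sketch below maps an 8-bit-per-channel digital symbol pixel to analog R, G, B levels; the 0.7 V full-scale swing and the function name are assumptions for illustration only, not values from the specification.

```python
# Illustrative sketch: convert an 8-bit-per-channel digital symbol pixel
# into analog R, G, B levels, as a DAC would.
# The 0.7 V full-scale video swing is an assumption for illustration.

FULL_SCALE_V = 0.7

def dac_rgb(pixel: tuple[int, int, int]) -> tuple[float, ...]:
    """Map 8-bit digital R, G, B codes (0-255) to analog voltage levels."""
    return tuple(code / 255 * FULL_SCALE_V for code in pixel)

print(dac_rgb((255, 128, 0)))   # e.g. a fully-saturated orange symbol pixel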

The second converter (116) is configured to cooperate with the first converter (114) to receive the analog symbol. The second converter (116) is configured to convert the analog symbol to a composite symbol. In an embodiment, the second converter (116) is a composite video converter, which is configured to convert the RGB symbol to the composite symbol.
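The RGB-to-composite step can be pictured, in much-simplified form, as computing a luminance component from the analog R, G, B levels. The sketch below uses the standard ITU-R BT.601 luma weights and deliberately omits chroma subcarrier modulation, sync and blanking insertion, which a real composite video converter would also produce.

```python
# Illustrative sketch of the RGB-to-composite step: compute the luminance (Y)
# component with the standard ITU-R BT.601 weights. Chroma subcarrier
# modulation, sync and blanking insertion are omitted for brevity.

def rgb_to_luma(r: float, g: float, b: float) -> float:
    """Return the luminance (Y) of analog R, G, B levels (BT.601 weights)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

print(rgb_to_luma(0.7, 0.35, 0.0))
```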

The multiplexer (118) is configured to cooperate with the amplifier (108) and the second converter (116) to receive the amplified stream and the composite symbol, respectively. The multiplexer (118) is configured to overlay the composite symbol over the amplified stream, and generate an analog output video. In an embodiment, the multiplexer (118) is configured to generate an annotated analog output video and display it on a display unit (not shown in the figures). The display unit can be a display screen. In one embodiment, the multiplexer (118) is configured to generate a control command and generate the analog output video using the analog symbol and the amplified stream.
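Viewed per sample, the multiplexer behaves like a two-input selector driven by the control command: wherever the current pixel belongs to a symbol, the composite symbol level is passed through; everywhere else, the amplified input video is. The following sketch is an illustrative assumption of that behaviour, with invented sample values and names.

```python
# Illustrative sketch of the analog multiplexer as a per-sample selector:
# pass the composite symbol level wherever the control command is asserted,
# and the amplified input video level everywhere else.

def overlay(video_samples, symbol_samples, select_symbol):
    """Return the overlaid output, choosing symbol or video per sample."""
    return [s if sel else v
            for v, s, sel in zip(video_samples, symbol_samples, select_symbol)]

video  = [0.30, 0.32, 0.31, 0.29, 0.30]   # amplified input video levels
symbol = [0.00, 0.70, 0.70, 0.70, 0.00]   # composite symbol levels
select = [False, True, True, True, False] # control command from the processing unit

print(overlay(video, symbol, select))
```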

In an embodiment, the system (100) includes a control signal generator (120). The control signal generator (120) is configured to control the multiplexer (118) using the pre-defined rules.

In an embodiment, the system (100) includes a database (122). The database (122) is configured to store a plurality of digital symbols for future use. In an embodiment, the database (122) can be implemented as, but is not limited to, an enterprise database, a remote database, a local database, and the like. In one embodiment, the databases (122) may be located either within the vicinity of each other or at different geographic locations. In another embodiment, the database (122) can be implemented inside or outside the system (100), and the database (122) can be implemented as a single database.
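One simple way to picture the database (122) of stored digital symbols is a lookup from an estimated pixel position to a small symbol bitmap. The layout below is an illustrative assumption only, not the claimed data structure.

```python
# Illustrative sketch: a minimal store of digital symbols keyed by the
# pixel position (line, pixel) at which each symbol is to be overlaid.
# The layout is an assumption for illustration, not the claimed structure.

symbol_db = {
    # (line, pixel) -> small monochrome bitmap (rows of 0/1)
    (50, 100): [[1, 1, 1],
                [1, 0, 1],
                [1, 1, 1]],
}

def symbol_for_position(line: int, pixel: int):
    """Retrieve the stored symbol to overlay at this pixel position, if any."""
    return symbol_db.get((line, pixel))

print(symbol_for_position(50, 100))
```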

Figure 2 illustrates a schematic diagram (200) depicting a workflow of the content annotation system of Figure 1, according to an exemplary implementation of the present invention.

In Figure 2, an amplifier (108) is configured to receive an analog input video stream from a source and amplify it. The amplifier (108) amplifies the analog input video stream and divides it into two parts, i.e. one part is transmitted to a synchronization separator (110) and the second part is transmitted to a multiplexer (118). The synchronization separator (110) is configured to extract synchronization signals from the amplified analog input video stream and generate extracted data. In an embodiment, the extracted data includes horizontal and vertical synchronization data. The processing unit (112) is configured to process the extracted data and determine at least one digital symbol. The processing unit (112) is configured to estimate the synchronization duration and the pixel position of the digital symbol for determining the digital symbol. The digital symbol is then transmitted to a first converter (114) and the multiplexer (118). The first converter (114) is a digital to analog converter (DAC), which is configured to convert the digital symbol to an analog RGB (Red, Green, Blue) symbol. The RGB symbol is then transmitted to a second converter (116). The second converter (116) is a composite video converter, which is configured to convert the RGB symbol to a composite symbol. The composite symbol signal is then transmitted to the multiplexer (118). In an embodiment, the multiplexer (118) is an analog multiplexer. The multiplexer (118) is configured to overlay the composite symbol over the amplified stream, and generate an analog output video, or an annotated analog output video.
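To make this workflow concrete, the following is a minimal simulation sketch of the Figure 2 signal chain for a single video line. It is illustrative only: all function names, the gain, the 0.7 V full-scale level and the symbol position (pixels 10 to 19) are assumptions, not values from the specification.

```python
# Illustrative end-to-end sketch of the Figure 2 workflow for one video line.
# All functions, gains and positions are simplified placeholder assumptions.

def amplify(sample, gain=2.0):
    """Amplifier (108): one copy goes to the sync path, one to the multiplexer."""
    return gain * sample

def digital_symbol_code(pixel):
    """Processing unit (112): return a stored symbol code at the estimated
    pixel positions (here, pixels 10-19), or None elsewhere."""
    return 255 if 10 <= pixel < 20 else None

def dac(code):
    """First converter (114): digital code -> analog level (0.7 V full scale assumed)."""
    return code / 255 * 0.7

def to_composite(level):
    """Second converter (116): RGB -> composite (kept as identity in this sketch)."""
    return level

def annotate_line(line_samples):
    """Multiplexer (118): overlay the composite symbol over the amplified video."""
    out = []
    for pixel, sample in enumerate(line_samples):
        video = amplify(sample)
        code = digital_symbol_code(pixel)   # pixel position from the clock count
        out.append(video if code is None else to_composite(dac(code)))
    return out

print(annotate_line([0.1] * 30))
```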

In an embodiment, the system (100) pertains to digital symbol overlay over the analog video, using the processing unit (112) for video synchronization duration estimation and pixel position estimation with synchronization as a reference. The pixel position is estimated, using the video synchronization as a reference, by running a clock whose width is

Clock width = S_d / N ….. (1)

where N is the number of clocks and S_d is the synchronization duration.

The number of clocks counted within the synchronization duration gives the pixel position. The digital symbols to be displayed at the estimated pixel positions are retrieved from the database (122). In an embodiment, the amplifier (108) sends the analog composite video along two paths, i.e. one to the multiplexer (118) and another to the synchronization separator (110), to avoid video distortion. One output of the amplifier (108), the composite analog signal, is sent to the multiplexer (118), and the other is sent to the synchronization separator (110) and onward to the processing unit (112). The processing unit (112) is configured to estimate the pixel position using the synchronization width as a reference and a clock counter value; the number of clocks during the video synchronization duration is continuously counted. The symbols retrieved from the database (122) are sent to the first converter (114), which generates the three analog video components (R, G, B). The R, G, B components are then converted to a composite symbol by the second converter (116). Subsequently, the composite video received from the amplifier (108) and the converted symbol are transmitted to the multiplexer (118). The opening and closing of the multiplexer (118) are controlled by the processing unit (112), based upon the estimated pixel position where the analog symbol is to be displayed on the display unit.
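As a worked example of equation (1) under assumed numbers (a synchronization duration of 64 µs and 1728 clocks counted over it, neither of which is specified here): Clock width = S_d / N = 64 µs / 1728 ≈ 37 ns, so each counted clock corresponds to roughly 37 ns of the line, and the running clock count since the last synchronization pulse indexes the pixel position at which a retrieved symbol should be switched in.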

Figure 3 illustrates a flowchart (300) depicting a method for annotating content, according to an exemplary implementation of the present invention.

The flowchart (300) starts at a step (302), receiving, by an input module (106), at least one input stream from one or more sources. In an embodiment, an input module (106) is configured to receive at least one input stream from one or more sources. At a step (304), amplifying, by an amplifier (108), the input stream. In an embodiment, an amplifier (108) is configured to amplify the input stream. At a step (306), extracting, by a synchronization separator (110), synchronization signals from the amplified stream, and generating extracted data. In an embodiment, a synchronization separator (110) is configured to extract synchronization signals from the amplified stream, and generate extracted data. At a step (308), processing, by a processing unit (112), the extracted data and determining at least one digital symbol. In an embodiment, a processing unit (112) is configured to process the extracted data, and determine at least one digital symbol. At a step (310), converting, by a first converter (114), the digital symbol to an analog symbol. In an embodiment, a first converter (114) is configured to convert the digital symbol to an analog symbol. At a step (312), converting, by a second converter (116), the analog symbol to a composite symbol. In an embodiment, a second converter (116) is configured to convert the analog symbol to a composite symbol. At a step (314), overlaying, by a multiplexer (118), the composite symbol over the amplified stream. In an embodiment, a multiplexer (118) is configured to overlay the composite symbol over the amplified stream. At a step (316), generating, by the multiplexer (118), an analog output video based on the overlaid symbol. In an embodiment, the multiplexer (118) is configured to generate an analog output video based on the overlaid symbol.

It should be noted that the description merely illustrates the principles of the present invention. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described herein, embody the principles of the present invention. Furthermore, all examples recited herein are principally intended expressly to be only for explanatory purposes to help the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.
CLAIMS
1. A method for annotating content, said method comprising:
receiving, by an input module (106), at least one input stream from one or more sources;
amplifying, by an amplifier (108), said input stream;
extracting, by a synchronization separator (110), synchronization signals from said amplified stream, and generating extracted data;
processing, by a processing unit (112), said extracted data, and determining at least one digital symbol;
converting, by a first converter (114), said digital symbol to an analog symbol;
converting, by a second converter (116), said analog symbol to a composite symbol;
overlaying, by a multiplexer (118), said composite symbol over said amplified stream; and
generating, by said multiplexer (118), an analog output video based on said overlaid symbol.

2. The method as claimed in claim 1, wherein said method includes storing, in a database (122), a plurality of digital symbols.

3. The method as claimed in claim 1, wherein processing said extracted data includes estimating synchronization duration and pixel position of said digital symbol for determining said digital symbol.

4. The method as claimed in claim 3, wherein estimating said pixel position using said synchronization signals includes counting a number of clocks in said synchronization duration.

5. The method as claimed in claim 1, wherein said method includes converting, by said first converter, said digital symbol to analog RGB (Red, Green, Blue) symbol.

6. The method as claimed in claim 1, wherein said method includes generating, by said multiplexer (118), a control command for generating said analog output video using said analog symbol and said amplified stream.

7. A content annotation system (100) comprising:
a memory (102) configured to store pre-defined rules;
a processor (104) configured to cooperate with said memory (102), said processor (104) configured to generate system processing commands based on said pre-defined rules;
an input module (106) configured to receive at least one input stream from one or more sources;
an amplifier (108) configured to cooperate with said input module (106), said amplifier (108) configured to amplify said input stream;
a synchronization separator (110) configured to cooperate with said amplifier (108), said synchronization separator configured to extract synchronization signals from said amplified stream, and generate extracted data;
a processing unit (112) configured to cooperate with said synchronization separator (110), said processing unit (112) configured to process said extracted data and determine at least one digital symbol;
a first converter (114) configured to cooperate with said processing unit (112), said first converter (114) configured to convert said digital symbol to an analog symbol;
a second converter (116) configured to cooperate with said first converter (114), said second converter (116) configured to convert said analog symbol to a composite symbol; and
a multiplexer (118) configured to cooperate with said second converter (116) and said amplifier (108), said multiplexer (118) configured to overlay said composite symbol over said amplified stream, and generate an analog output video.

8. The system (100) as claimed in claim 7, wherein said system (100) includes a control signal generator (120), said control signal generator (120) is configured to control said multiplexer (118) using said pre-defined rules.

9. The system (100) as claimed in claim 7, wherein said system (100) includes a database (122) configured to store a plurality of digital symbols.

10. The system (100) as claimed in claim 7, wherein said processing unit (112) is configured to estimate synchronization duration and pixel position of said digital symbol for determining said digital symbol.

11. The system (100) as claimed in claim 10, wherein said processing unit (112) is configured to estimate said pixel position using said synchronization signals by counting a number of clocks in said synchronization duration.

12. The system (100) as claimed in claim 7, wherein said first converter (114) is a digital to analog converter (DAC), configured to convert said digital symbol to analog RGB (Red, Green, Blue) symbol.

13. The system (100) as claimed in claim 7, wherein said second converter (116) is a composite video converter.

14. The system (100) as claimed in claim 7, wherein said multiplexer (118) is configured to generate a control command and generate said analog output video using said analog symbol and said amplified stream.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 201941011843-PROVISIONAL SPECIFICATION [26-03-2019(online)].pdf 2019-03-26
2 201941011843-FORM 1 [26-03-2019(online)].pdf 2019-03-26
3 201941011843-DRAWINGS [26-03-2019(online)].pdf 2019-03-26
4 201941011843-FORM-26 [13-06-2019(online)].pdf 2019-06-13
5 Correspondence by Agent_Form26_18-06-2019.pdf 2019-06-18
6 201941011843-Proof of Right (MANDATORY) [27-06-2019(online)].pdf 2019-06-27
7 Correspondence by Agent _Form-1_08-07-2019.pdf 2019-07-08
8 201941011843-FORM 3 [23-07-2019(online)].pdf 2019-07-23
9 201941011843-ENDORSEMENT BY INVENTORS [23-07-2019(online)].pdf 2019-07-23
10 201941011843-DRAWING [23-07-2019(online)].pdf 2019-07-23
11 201941011843-CORRESPONDENCE-OTHERS [23-07-2019(online)].pdf 2019-07-23
12 201941011843-COMPLETE SPECIFICATION [23-07-2019(online)].pdf 2019-07-23
13 201941011843-FORM 18 [24-12-2020(online)].pdf 2020-12-24
14 201941011843-FER.pdf 2022-01-03
15 201941011843-ABSTRACT [24-06-2022(online)].pdf 2022-06-24
16 201941011843-CLAIMS [24-06-2022(online)].pdf 2022-06-24
17 201941011843-COMPLETE SPECIFICATION [24-06-2022(online)].pdf 2022-06-24
18 201941011843-DRAWING [24-06-2022(online)].pdf 2022-06-24
19 201941011843-FER_SER_REPLY [24-06-2022(online)].pdf 2022-06-24
20 201941011843-OTHERS [24-06-2022(online)].pdf 2022-06-24
21 201941011843-US(14)-HearingNotice-(HearingDate-05-07-2024).pdf 2024-06-10
22 201941011843-Correspondence to notify the Controller [02-07-2024(online)].pdf 2024-07-02
23 201941011843-Written submissions and relevant documents [19-07-2024(online)].pdf 2024-07-19
24 201941011843-AMENDED DOCUMENTS [04-10-2024(online)].pdf 2024-10-04
25 201941011843-FORM 13 [04-10-2024(online)].pdf 2024-10-04
26 201941011843-POA [04-10-2024(online)].pdf 2024-10-04
27 201941011843-Response to office action [01-11-2024(online)].pdf 2024-11-01
28 201941011843-PatentCertificate14-11-2024.pdf 2024-11-14
29 201941011843-IntimationOfGrant14-11-2024.pdf 2024-11-14

Search Strategy

1 SearchStrategyE_17-12-2021.pdf

ERegister / Renewals

3rd: 07 Feb 2025

From 26/03/2021 - To 26/03/2022

4th: 07 Feb 2025

From 26/03/2022 - To 26/03/2023

5th: 07 Feb 2025

From 26/03/2023 - To 26/03/2024

6th: 07 Feb 2025

From 26/03/2024 - To 26/03/2025

7th: 19 Mar 2025

From 26/03/2025 - To 26/03/2026