
Method And System For Measuring End User Video Quality

Abstract: The present application provides a method and system for extracting at least one metrics to measure end user video quality. The invention provides a system and method to extract standardized metrics to measure end user video quality (EUVQ). The invention uses a set of predefined videos to enable extraction of at least one metrics for measuring EUVQ using a monitoring station and a monitor bank component, wherein the video is stored with a content provider and played over a network, including internet and mobile networks. The invention uses camera-based video capture to obtain videos representing the end user video. Further, the invention provides for comparing the at least one metrics to measure EUVQ for a plurality of distinct content providers.


Patent Information

Application #
201621013132
Filing Date
14 April 2016
Publication Number
42/2017
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
ip@legasis.in
Parent Application
Patent Number
Legal Status
Grant Date
2023-07-21
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai-400021, Maharashtra, India

Inventors

1. PRABHAKARAN, Joy Pullat
Tata Consultancy Services Limited, Think Campus, Electronic City, Phase II, Bangalore - 560 100, Karnataka, India
2. K K, Suresh Kumar
Tata Consultancy Services Limited, Think Campus, Electronic City Phase II, Bangalore - 560 100, Karnataka, India
3. PUTTAPPA KARIYAPPA, Manjunath
Tata Consultancy Services Limited, Think Campus, Electronic City, Phase II, Bangalore - 560 100, Karnataka, India

Specification

Claims:

1. A method for extracting at least one metrics for measuring end user video quality (EUVQ); said method comprising processor implemented steps of:
transmitting, by a monitoring bank (206), a trigger to initiate playback for at least one predefined video wherein the at least one predefined video is stored with a content distributor, and played over a network;
receiving, by a monitoring station (208), the trigger wherein receiving the trigger results in playback of the at least one predefined video on at least one playback device such that the at least one predefined video is received from a content distributor;
capturing, by one or more video capture device (210) the at least one predefined video playback, wherein the one or more video capture device is operatively coupled to the monitoring station (208);
capturing, a metadata associated with each of the at least one predefined video by employing one or more image processing techniques using a metadata module (212);
transmitting, to the monitor bank (206), the captured at least one predefined video and the metadata associated with each of the at least one predefined video using a transmission module (214);
processing, by a data processing module (216) the at least one predefined video and the metadata associated with each of the at least one predefined video to extract the at least one metrics for measuring EUVQ such that the data processing module (216) is operatively coupled with the monitor bank, wherein the at least one metrics are extracted based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time and a duration of video played before a connection drop.

2. The method according to claim 1 further comprising
extracting the at least one metrics for measuring EUVQ for the at least one predefined video such that the at least one predefined video is stored with a plurality of distinct content distributors, wherein the at least one metrics for measuring EUVQ is extracted corresponding to each of the plurality of distinct content distributors;
comparing the at least one metrics corresponding to each of the plurality of distinct content distributors to measure a relative EUVQ for the plurality of distinct content distributors.

3. The method according to claim 1 wherein the at least one predefined video is such that a set of features are known for the at least one predefined video, wherein the set of features comprises a resolution, a frame rate per second and a high frame frequency.

4. The method according to claim 1 wherein the at least one metrics is uploaded to a cloud server.

5. The method according to claim 1 further comprising generating a quality metric suite for the content distributor by calculating an average of the at least one metrics over different videos such that the different videos are played at different times, locations and different network service providers using the data processing module (216).

6. The method according to claim 1 wherein the monitoring station comprises at least one sensor operatively coupled to the at least one playback device such that the sensor receives the trigger and initiates video playback on the at least one playback device.

7. The method according to claim 1 wherein the network is a non-dedicated network comprising internet network and mobile network.

8. The method of claim 1 wherein the metadata comprises the value of compression before ingestion into the content delivery network (CDN), streaming technology, distance of the edge server that serves the consumer, network capacity and load, last mile connectivity, resources present on the consumption device, resources on the consumption device that are available for the video processing, and consumption viewport size.

9. A system (102) for extracting at least one metrics for measuring end user video quality; comprising a processor (202), a memory (204), a monitoring bank (206) and a monitoring station (208) operatively coupled with said processor, the system comprising:
the monitoring bank (206), configured to transmit a trigger to initiate playback for at least one predefined video wherein the at least one predefined video is played over a network;
the monitoring station (208) configured to receive, the trigger wherein receiving the trigger results in playback of the at least one predefined video on at least one playback device such that the at least one predefined video is received from a content distributor;
one or more video capture device (210) configured to capture the at least one predefined video playback, wherein the one or more video capture device (210) is operatively coupled to the monitoring station (208);
a metadata module (212) configured to capture, a metadata associated with each of the at least one predefined video by employing one or more image processing techniques;
a transmission module (214) configured to transmit, to the monitor bank (206), the captured at least one predefined video and the metadata associated with each of the at least one predefined video;
a data processing module (216) configured to process the at least one predefined video and the metadata associated with each of the at least one predefined video to extract the at least one metrics for measuring EUVQ such that the data processing module (216) is operatively coupled with the monitor bank, wherein the at least one metrics are extracted based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time, a number of frames dropped, a number of frozen frames and a duration of video played before a connection drop.

10. The system (102) according to claim 9, further configured to:
extract the at least one metrics for measuring EUVQ for the at least one predefined video such that the at least one predefined video is stored with a plurality of distinct content distributors, wherein the at least one metrics for measuring EUVQ is extracted corresponding to each of the plurality of distinct content distributors;
match the at least one metrics corresponding to each of the plurality of distinct content distributors to measure a relative EUVQ for the plurality of distinct content distributors.

11. The system (102) according to claim 9, wherein the data processing module (216) is further configured to generate a quality metric suite for the content distributor by calculating an average of the at least one metrics over different videos such that the different videos are played at different times, locations and different network service providers.

12. The system (102) according to claim 9, wherein the at least one predefined video is such that a set of features are known for the at least one predefined video, wherein the set of features comprises a resolution, a frame rate per second and a high frame frequency.

13. The system (102) according to claim 9, wherein the monitoring station comprises at least one sensor operatively coupled to the at least one playback device such that the sensor receives the trigger and initiates video playback on the at least one playback device.

14. The system (102) according to claim 9, wherein the network is a non-dedicated network comprising internet network and mobile network.

15. The system (102) according to claim 9, wherein the metadata comprises the value of compression before ingestion into the content delivery network (CDN), streaming technology, distance of the edge server that serves the consumer, network capacity and load, last mile connectivity, resources present on the consumption device, resources on the consumption device that are available for the video processing, and consumption viewport size.
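
By way of a non-limiting illustration, the quality parameters recited in claims 1 and 9 can be grouped into a single record per captured playback. The following is a minimal sketch; the dataclass and its field names are editorial assumptions and not part of the claims, which only enumerate the parameters.

```python
from dataclasses import dataclass


@dataclass
class PlaybackQualityRecord:
    """One captured playback described by the quality parameters of claims 1 and 9.

    Field names are illustrative; the claims only list the parameters themselves.
    """
    play_duration_s: float       # play duration
    total_frames: int            # total number of frames of the predefined video
    frames_rendered: int         # total number of frames rendered
    psnr_db: float               # signal to noise ratio (PSNR)
    startup_time_s: float        # video startup time
    frames_dropped: int          # number of frames dropped (claim 9)
    frames_frozen: int           # number of frozen frames (claim 9)
    played_before_drop_s: float  # duration of video played before a connection drop
```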
Description:

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
METHOD AND SYSTEM FOR MEASURING END USER VIDEO QUALITY

Applicant:
Tata Consultancy Services Limited
A company Incorporated in India under the Companies Act, 1956
Having address:
Nirmal Building, 9th floor,
Nariman point, Mumbai 400021,
Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
[001] The present application generally relates to video distribution over a network. Particularly, the application provides a method and system for measuring end user video quality for video consumed over a network.

BACKGROUND OF THE INVENTION
[002] Video consumption over networks that are not dedicated to video distribution has grown tremendously over the past few decades. This phenomenon began with the growth of video consumption over the internet and has further accelerated with the growth being seen in video consumption over mobile networks.

[003] The next step is the measurement of this quality and the establishment of standards for it. Video distribution companies have internal measures which capture certain aspects, but these are not accepted industry standards; hence the opportunity exists to establish or influence the evolution of such standards.

[004] In video consumption over non-dedicated networks, the quality of the video that the end-user consumes is difficult to measure or specify. This is because of the inherent variation in operating parameters.

[005] From the end user perspective it is seen that the quality of video varies based on various factors including the video provider, the nature of the video, the compression used, the delivery network, the consumption device, the network load, the consumption geography, the time of day, the state of the consumption device and the like. All of this leads to fluctuation in quality even over short durations of time.

[006] A majority of existing solutions use bitrate and resolution as indicative measures of quality. However, both bitrate and resolution are very weak measures of video quality. The current state of the art does not provide a comprehensive end user video quality metric. Further, there is no mechanism to measure the signal to noise ratio available to the user.

SUMMARY OF THE INVENTION
[007] Before the present methods, systems, and hardware enablement are described, it is to be understood that this invention is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments of the present invention which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present invention which will be limited only by the appended claims.

[008] The present application provides a method and system for measuring end user video quality (EUVQ).

[009] The present application provides a computer implemented method for measuring end user video quality, wherein said method comprises transmitting, by a monitoring bank (206), a trigger to initiate playback for at least one predefined video wherein the at least one predefined video is stored with a content distributor, and played over a network. The method further comprises receiving, by a monitoring station (208), the trigger. In an aspect of the disclosed subject matter, receiving the trigger results in playback of the at least one predefined video on at least one playback device such that the at least one predefined video is received from a content distributor. The method further comprises capturing, by one or more video capture device (210), the at least one predefined video playback, wherein the one or more video capture device is operatively coupled to the monitoring station (208). The method further comprises capturing a metadata associated with each of the at least one predefined video by employing one or more image processing techniques using a metadata module (212). The captured at least one predefined video and the corresponding metadata are transmitted to the monitor bank (206) using a transmission module (214). The method further comprises processing, by a data processing module (216), the at least one predefined video and the metadata associated with each of the at least one predefined video to extract the at least one metrics for measuring EUVQ such that the data processing module (216) is operatively coupled with the monitor bank. In an embodiment the at least one metrics are extracted based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time and a duration of video played before a connection drop.

[0010] In another aspect, the present application provides a system (102) for extracting at least one metrics for measuring end user video quality; comprising a processor (202), a memory (204), a monitoring bank (206) and a monitoring station (208) operatively coupled with said processor. In an embodiment the monitoring bank (206) is configured to transmit a trigger to initiate playback for at least one predefined video wherein the at least one predefined video is played over a network. Further the monitoring station (208) is configured to receive the trigger. In an embodiment, receiving the trigger results in playback of the at least one predefined video on at least one playback device such that the at least one predefined video is received from a content distributor. The system further comprises one or more video capture device (210) configured to capture the at least one predefined video playback. In an embodiment the one or more video capture device (210) is operatively coupled to the monitoring station (208). Further, in another aspect, the system comprises a metadata module (212) configured to capture a metadata associated with each of the at least one predefined video by employing one or more image processing techniques. Further, a transmission module (214) is configured to transmit, to the monitor bank (206), the captured at least one predefined video and the metadata associated with each of the at least one predefined video, and a data processing module (216) is configured to process the at least one predefined video and the metadata associated with each of the at least one predefined video to extract the at least one metrics for measuring EUVQ such that the data processing module (216) is operatively coupled with the monitor bank. In an embodiment the at least one metrics are extracted based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time, a number of frames dropped, a number of frozen frames and a duration of video played before a connection drop.

BRIEF DESCRIPTION OF THE DRAWINGS
[0011] The foregoing summary, as well as the following detailed description of preferred embodiments, are better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, there is shown in the drawings exemplary constructions of the invention; however, the invention is not limited to the specific methods and system disclosed. In the drawings:

[0012] Figure 1: illustrates a network implementation of a system for measuring end user video quality (EUVQ), in accordance with an embodiment of the present subject matter;

[0013] Figure 2: shows a block diagram illustrating the system for measuring EUVQ, in accordance with an embodiment of the present subject matter;

[0014] Figure 3: shows a flowchart illustrating the method for extracting at least one metrics for measuring EUVQ;

[0015] Figure 4: shows a flowchart illustrating the method for comparing the at least one metrics for a plurality of distinct content distributors;

[0016] Figure 5: shows a flowchart illustrating exemplary steps for the working of the monitoring bank component of the system; and

[0017] Figure 6: shows a flowchart illustrating exemplary steps for the working of the monitoring station component of the system.

DETAILED DESCRIPTION OF THE INVENTION
[0018] Some embodiments of this invention, illustrating all its features, will now be discussed in detail.

[0019] The words "comprising," "having," "containing," and "including," and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items.

[0020] It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present invention, the preferred systems and methods are now described.

[0021] The disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms.

[0022] The elements illustrated in the Figures inter-operate as explained in more detail below. Before setting forth the detailed explanation, however, it is noted that all of the discussion below, regardless of the particular implementation being described, is exemplary in nature, rather than limiting. For example, although selected aspects, features, or components of the implementations are depicted as being stored in memories, all or part of the systems and methods consistent with the end user video quality measurement system and method may be stored on, distributed across, or read from other machine-readable media.

[0023] The techniques described above may be implemented in one or more computer programs executing on (or executable by) a programmable computer including any combination of any number of the following: a processor, a storage medium readable and/or writable by the processor (including, for example, volatile and non-volatile memory and/or storage elements), plurality of input units, and plurality of output devices. Program code may be applied to input entered using any of the plurality of input units to perform the functions described and to generate an output displayed upon any of the plurality of output devices.

[0024] Each computer program within the scope of the claims below may be implemented in any programming language, such as assembly language, machine language, a high-level procedural programming language, or an object-oriented programming language. The programming language may, for example, be a compiled or interpreted programming language. Each such computer program may be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a computer processor.

[0025] Method steps of the invention may be performed by one or more computer processors executing a program tangibly embodied on a computer-readable medium to perform functions of the invention by operating on input and generating output. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, the processor receives (reads) instructions and data from a memory (such as a read-only memory and/or a random access memory) and writes (stores) instructions and data to the memory. Storage devices suitable for tangibly embodying computer program instructions and data include, for example, all forms of non-volatile memory, such as semiconductor memory devices, including EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROMs. Any of the foregoing may be supplemented by, or incorporated in, specially-designed ASICs (application-specific integrated circuits) or FPGAs (Field-Programmable Gate Arrays). A computer can generally also receive (read) programs and data from, and write (store) programs and data to, a non-transitory computer-readable storage medium such as an internal disk (not shown) or a removable disk.

[0026] Any data disclosed herein may be implemented, for example, in one or more data structures tangibly stored on a non-transitory computer-readable medium. Embodiments of the invention may store such data in such data structure(s) and read such data from such data structure(s).

[0027] The present application provides a computer implemented method and system for measuring end user video quality.

[0028] The present application provides a computer implemented method and system for measuring end user video quality. Referring now to Fig. 1, a network implementation 100 of a system 102 for measuring end user video quality is illustrated, in accordance with an embodiment of the present subject matter. Although the present subject matter is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In one implementation, the system 102 may be implemented in a cloud-based environment. In another embodiment, it may be implemented as custom built hardware designed to efficiently perform the invention disclosed. It will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user devices 104 hereinafter, or applications residing on the user devices 104. Examples of the user devices 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 104 are communicatively coupled to the system 102 through a network 106.

[0029] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.

[0030] In one embodiment the present invention, referring to Fig. 2, describes a detailed working of the various components of the system 102.

[0031] A system (102) for extracting at least one metrics for measuring end user video quality comprises a processor (202), a memory (204), a monitoring bank (206) and a monitoring station (208) operatively coupled with said processor. In an aspect the monitoring bank (206) is configured to transmit a trigger to initiate playback for at least one predefined video wherein the at least one predefined video is played over a network.

[0032] In another aspect, the system further comprises the monitoring station (208) configured to receive the trigger. In an embodiment, the monitoring station comprises at least one sensor operatively coupled to at least one playback device such that the sensor receives the trigger and initiates video playback on the at least one playback device. On receiving the trigger, at least one predefined video is played on the at least one playback device such that the at least one predefined video is received from a content distributor. The system (102) further comprises one or more video capture device (210) configured to capture the at least one predefined video playback, wherein the one or more video capture device (210) is operatively coupled to the monitoring station (208).
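
As a non-limiting sketch of the trigger path described above, the monitoring bank could notify a monitoring station over any convenient transport. The HTTP endpoint, port and JSON payload below are editorial assumptions; the specification only requires that a trigger be transmitted and received.

```python
import json
import urllib.request


def send_playback_trigger(station_url: str, video_id: str) -> None:
    """Monitoring bank side: request that a monitoring station start playback of a predefined video.

    The URL and payload shape are hypothetical; any transport that delivers the
    trigger to the station's sensor would serve the same purpose.
    """
    payload = json.dumps({"action": "play", "video_id": video_id}).encode("utf-8")
    request = urllib.request.Request(
        station_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        response.read()  # station acknowledges; its sensor then initiates playback


# Example (hypothetical station address):
# send_playback_trigger("http://monitoring-station.local:8080/trigger", "reference_clip_01")
```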

[0033] Further the system comprises a metadata module (212) configured to capture a metadata associated with each of the at least one predefined video by employing one or more image processing techniques. The system (102) further comprises a transmission module (214) configured to transmit, to the monitor bank (206), the captured at least one predefined video and the metadata associated with each of the at least one predefined video. Further the system (102) comprises a data processing module (216) configured to process the at least one predefined video and the metadata associated with each of the at least one predefined video to extract the at least one metrics for measuring EUVQ.

[0034] In an embodiment the data processing module (216) is operatively coupled with the monitor bank, wherein the at least one metrics are extracted, based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time and a duration of video played before a connection drop.

[0035] In another embodiment the data processing module (216) is further configured to generate a quality metric suite for the content distributor by calculating an average of the at least one metrics over different videos, such that the different videos are played at different times, at different locations and over different network service providers.

[0036] In one embodiment the at least one metrics may be derived using quality parameters Q1, Q2, Q3, Q4 and Q5, wherein Q1 is calculated using equation (1), Q2 is determined from the video startup time in seconds, Q3 from the fraction of the video played before a connection drop, Q4 from the number of dropped frames and Q5 from the number of frozen frames.

Q1 = (Content duration / Play duration) * (Number of frames rendered / Total number of frames) * (PSNR factor)     (1)

where PSNR factor = 1 if PSNR > 35 dB, and PSNR factor = PSNR / 35 otherwise,

and wherein PSNRe is an estimate of the PSNR derived after an image processing transformation of the grabbed video.
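
A direct transcription of equation (1) into code could look as follows; treating the estimated PSNRe as the PSNR entering the PSNR factor is an assumption based on the paragraph above, and all function and argument names are illustrative.

```python
def psnr_factor(psnr_db: float) -> float:
    """PSNR factor of equation (1): 1 if PSNR > 35 dB, PSNR/35 otherwise."""
    return 1.0 if psnr_db > 35.0 else psnr_db / 35.0


def compute_q1(content_duration_s: float,
               play_duration_s: float,
               frames_rendered: int,
               total_frames: int,
               psnr_estimate_db: float) -> float:
    """Q1 = (content duration / play duration) * (frames rendered / total frames) * PSNR factor.

    psnr_estimate_db stands in for PSNRe, the PSNR estimated from the grabbed video.
    """
    return ((content_duration_s / play_duration_s)
            * (frames_rendered / total_frames)
            * psnr_factor(psnr_estimate_db))


# Example: a 60 s clip that took 63.5 s to play, rendered 1480 of 1500 frames
# at an estimated 32 dB PSNR gives Q1 of roughly 0.85.
```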

[0037] EUVQ over the internet is impacted by various factors including the quality and nature of the raw source video, the compression during ingestion into the content delivery network (CDN), the streaming technology, the distance of the edge server that serves the consumer, network capacity and load, last mile connectivity, the resources present on the consumption device, the resources on the consumption device that are available for the relevant video processing, the consumption viewport size, and user choices and behavior during consumption.

[0038] Further, the impact of the above-mentioned factors is visible at the end user video consumption end through observable effects comprising the signal to noise ratio (PSNR) available to the user, frame drops, the time to start video play, buffering time, connection drops and the correctness of the total duration. These effects are used for the extraction of the at least one quality parameter.

[0039] In another embodiment the at least one predefined video is such that a set of features is known for the at least one predefined video. In an aspect the set of features comprises a resolution, a frame rate per second and a high frame frequency.

[0040] Referring now to Fig. 3 a flow chart illustrating the method for extracting at least one metrics for measuring EUVQ is shown. The process starts at step 302 where a monitoring bank (206) transmits a trigger to a monitoring station (208) to initiate playback for at least one predefined video. In an embodiment the at least one predefined video is stored with a content distributor, and played over a network.

[0041] The values Q1, Q2, Q3, Q4 and Q5 will be averaged over different videos, played at different times, locations and different network service providers to give a quality metric suite for the content distributor. The range over which the averaging is done will be chosen to give other useful information.
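
One possible realisation of this averaging is a simple group-and-mean over individual measurements. The record layout and the grouping key below are editorial assumptions chosen for illustration only.

```python
from collections import defaultdict
from statistics import mean


def quality_metric_suite(measurements: list) -> dict:
    """Average Q1..Q5 over all measurements taken for each content distributor.

    Each measurement is assumed to be a dict such as
    {"distributor": "cdn_a", "q1": 0.91, "q2": 1.4, "q3": 1.0, "q4": 3, "q5": 0},
    collected at different times, locations and network service providers.
    """
    grouped = defaultdict(list)
    for measurement in measurements:
        grouped[measurement["distributor"]].append(measurement)

    return {
        distributor: {q: mean(row[q] for row in rows) for q in ("q1", "q2", "q3", "q4", "q5")}
        for distributor, rows in grouped.items()
    }
```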

[0042] At the step 304 the monitoring station (208) receives the trigger wherein receiving the trigger results in playback of the at least one predefined video on at least one playback device. In an embodiment, the at least one predefined video is received from a content distributor.

[0043] At the step 306 one or more video capture device (210) captures the at least one predefined video playback. In an embodiment, the one or more video capture device is operatively coupled to the monitoring station (208).

[0044] At the step 308, a metadata associated with each of the at least one predefined video is captured by employing one or more image processing techniques using a metadata module (212).

[0045] At the step 310 the captured at least one predefined video and the metadata associated with each of the at least one predefined video is transmitted to the monitor bank (206) using a transmission module (214).

[0046] At the step 312, the captured at least one predefined video and the metadata associated with each of the captured at least one predefined video is processed using a data processing module (216). In an embodiment the data processing module (216) is operatively coupled with the monitor bank. In another aspect, the at least one metrics are extracted based on one or more quality parameters comprising a play duration, a total number of frames of the at least one predefined video, a total number of frames rendered, a signal to noise ratio (PSNR), a video startup time, a number of frames dropped, a number of frames frozen and a duration of video played before a connection drop.
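
The specification leaves the image processing techniques of steps 306-312 open. The sketch below assumes OpenCV is applied to the captured playback to count rendered frames and frozen frames by simple frame differencing; the threshold and the library choice are assumptions, not requirements of the method.

```python
import cv2  # OpenCV, an assumed choice of image processing library
import numpy as np


def capture_playback_metadata(capture_path: str, freeze_threshold: float = 1.0) -> dict:
    """Count rendered and frozen frames in a captured playback recording.

    A frame is treated as frozen when its mean absolute difference from the
    previous frame falls below freeze_threshold; the threshold is illustrative.
    """
    capture = cv2.VideoCapture(capture_path)
    previous = None
    rendered, frozen = 0, 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        rendered += 1
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if previous is not None and np.mean(cv2.absdiff(gray, previous)) < freeze_threshold:
            frozen += 1
        previous = gray
    capture.release()
    return {"frames_rendered": rendered, "frames_frozen": frozen}
```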

[0047] Referring to Fig. 4, a flowchart illustrating the method for comparing the at least one metrics for a plurality of distinct content distributors is shown.

[0048] At the step 402 the at least one metrics for measuring EUVQ is extracted for the at least one predefined video such that the at least one predefined video is stored with a plurality of distinct content distributors, wherein the at least one metrics for measuring EUVQ is extracted corresponding to each of the plurality of distinct content distributors.

[0049] At the step 404 the at least one metrics corresponding to each of the plurality of distinct content distributors is compared with each other.

[0050] Finally at the step 406 a relative EUVQ is measured wherein the relative EUVQ determines the relative video quality based on the at least one metrics, for the plurality of distinct content distributors.
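
Steps 404 and 406 can be expressed as a ranking of the averaged metric suites. The composite score used below, an unweighted mean of Q1..Q5, is purely an editorial assumption; the specification does not define how the individual metrics are combined, and in practice each Q would first be normalised so that larger values mean better quality.

```python
def rank_distributors(suites: dict) -> list:
    """Rank content distributors by a composite of their averaged quality metrics.

    `suites` is the output of quality_metric_suite(); the unweighted mean used
    here assumes every metric has been normalised so that higher is better.
    """
    def composite(metrics: dict) -> float:
        return sum(metrics.values()) / len(metrics)

    return sorted(((distributor, composite(metrics)) for distributor, metrics in suites.items()),
                  key=lambda pair: pair[1], reverse=True)
```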

[0051] Fig. 5 and Fig. 6 are flowcharts illustrating an exemplary working of the monitor bank and the monitoring station, respectively. Further, the working shown and the terms illustrated are not intended to limit the scope of the disclosed invention.

[0052] As per an exemplary implementation of the disclosed invention, each monitor bank consists of one or more monitoring stations. In an embodiment, the one or more monitoring stations are a mix and match of subsystems representing the combination of user end scenarios against which the monitoring has to happen. Corresponding to each scenario, there may be one or more monitoring stations, with each monitoring station in the set being used to monitor a different usage scenario.

[0053] According to an implementation of the disclosed invention, the monitoring stations would access video through different networks that are popular in the location where the monitoring is to be carried out. In another implementation the system may use appropriate sampling techniques to build a composite picture from limited samples.
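
The sampling techniques themselves are left open by the specification. A minimal sketch, assuming plain random sampling over scenario, network and time-slot combinations, might look like this; stratified or weighted sampling would be equally admissible.

```python
import itertools
import random


def sample_monitoring_runs(scenarios, networks, time_slots, k, seed=None):
    """Pick k scenario/network/time-slot combinations out of all possible ones.

    Simple random sampling is only one candidate technique; the inputs and the
    function itself are illustrative.
    """
    rng = random.Random(seed)
    combinations = list(itertools.product(scenarios, networks, time_slots))
    return rng.sample(combinations, min(k, len(combinations)))
```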

[0054] According to an aspect of the disclosed invention referring to Fig. 5, the monitor bank would perform the following steps: 1) trigger the monitoring station to play a video from a content distributor, 2) receive the at least one video captured by the monitoring station and the metadata generated at the station, and 3) process the captured video and metadata to extract the relevant metrics. Also, in an embodiment, the monitor bank may further upload the metrics to a shared global data store. The process may be repeated iteratively and the relevant metrics may be taken as an average over the repeated runs.
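
Pulling these pieces together, the three monitor bank steps of Fig. 5, plus the optional upload to a shared data store, could be iterated as sketched below. The helper functions reuse the earlier sketches; `receive_capture` and `upload_to_store` are placeholders for transport mechanisms that the specification leaves open.

```python
def monitor_bank_cycle(stations, videos, receive_capture, upload_to_store=None):
    """One exemplary monitoring cycle: trigger each station, collect its captured
    video and metadata, extract metrics, and optionally upload them to a shared store.
    """
    results = []
    for station_url in stations:
        for video_id in videos:
            send_playback_trigger(station_url, video_id)           # step 1: trigger playback
            capture_path, metadata = receive_capture(station_url)  # step 2: captured video + metadata
            metrics = capture_playback_metadata(capture_path)      # step 3: extract the relevant metrics
            metrics.update(metadata)
            results.append({"station": station_url, "video": video_id, **metrics})
            if upload_to_store is not None:
                upload_to_store(results[-1])                       # optional: shared global data store
    return results
```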

[0055] As per an exemplary implementation of the disclosed invention as illustrated by Fig. 6, the monitoring station is responsible for playing at least one predefined video from different content providers. The at least one predefined video played is captured by a video capture device operatively coupled with the monitoring station. The captured video is transferred to the monitor bank along with corresponding metadata which is processed by the monitor bank to extract at least one metrics for determining EUVQ.

Documents

Application Documents

# Name Date
1 Form 3 [14-04-2016(online)].pdf 2016-04-14
3 Form 18 [14-04-2016(online)].pdf 2016-04-14
4 Drawing [14-04-2016(online)].pdf 2016-04-14
5 Description(Complete) [14-04-2016(online)].pdf 2016-04-14
6 Form 26 [13-06-2016(online)].pdf 2016-06-13
7 201621013132-POWER OF ATTORNEY-(15-06-2016).pdf 2016-06-15
8 201621013132-CORRESPONDENCE-(15-06-2016).pdf 2016-06-15
9 ABSTRACT1.JPG 2018-08-11
10 201621013132-Form 1-100516.pdf 2018-08-11
11 201621013132-Correspondence-100516.pdf 2018-08-11
12 201621013132-FER.pdf 2020-06-29
13 201621013132-OTHERS [09-12-2020(online)].pdf 2020-12-09
14 201621013132-FER_SER_REPLY [09-12-2020(online)].pdf 2020-12-09
15 201621013132-COMPLETE SPECIFICATION [09-12-2020(online)].pdf 2020-12-09
16 201621013132-CLAIMS [09-12-2020(online)].pdf 2020-12-09
17 201621013132-PatentCertificate21-07-2023.pdf 2023-07-21
18 201621013132-IntimationOfGrant21-07-2023.pdf 2023-07-21

Search Strategy

1 searchE_25-06-2020.pdf

ERegister / Renewals

3rd: 20 Oct 2023 (From 14/04/2018 - To 14/04/2019)
4th: 20 Oct 2023 (From 14/04/2019 - To 14/04/2020)
5th: 20 Oct 2023 (From 14/04/2020 - To 14/04/2021)
6th: 20 Oct 2023 (From 14/04/2021 - To 14/04/2022)
7th: 20 Oct 2023 (From 14/04/2022 - To 14/04/2023)
8th: 20 Oct 2023 (From 14/04/2023 - To 14/04/2024)
9th: 13 Apr 2024 (From 14/04/2024 - To 14/04/2025)
10th: 18 Mar 2025 (From 14/04/2025 - To 14/04/2026)