
Method For Detecting And Alerting A User Condition During Voice Based Detection

Abstract: The present invention describes a method and system (200) for detecting and alerting a user condition during a voice-based detection. The system (200) implements the method, which comprises: receiving a voice-based command from said user, detecting the presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command, and triggering output of false information pertaining to an operation underway.


Patent Information

Application #: 2746-DEL-2015
Filing Date: 01 September 2015
Publication Number: 39/2015
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: mail@lexorbis.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-02-24
Renewal Date:

Applicants

Comviva Technologies Limited
A-26, Info City, Sector 34, Gurgaon-122001, Haryana, India

Inventors

1. BANERJEE, Amrita
1086, Sobha Moonstone, Jakkur Main Road, Dasarahalli, Byatrayanapura, Bangalore – 5600024, India

Specification

FIELD OF THE INVENTION:

The present invention relates to authentication systems and, in particular, to voice-based identification mechanisms in such authentication systems.

BACKGROUND OF THE INVENTION:

Authenticating a user prior to processing a requested service has assumed great significance and taken various forms with technological evolution. In some cases it may be a single-stage authentication, such as a single-stage challenge-response mechanism, while in other cases multi-stage authentication may be prevalent. To substitute or supplement existing authentication mechanisms, biometric identification systems have also evolved. Biometric identification systems are known to exist in various forms, such as voice-, retinal-scan- and fingerprint-based detection systems.

Worldwide, there have been growing instances of crimes such as holding people to ransom or at gunpoint to forcefully demand services, goods, money, etc. Most often, such instances have been reported at the premises of cash-dispensing ATMs, where a person can be waylaid by a perpetrator and terrorized into forcefully withdrawing cash or revealing account details. Under such circumstances, the victim is forced to authenticate himself, and cash dispensers have no option other than to honour the request. Moreover, as perpetrators are generally aware of the sequence of operations that leads to dispensation of goods and services, a victim trying to intentionally use wrong commands to deliberately suppress an outcome is likely to be detected by the perpetrator. For example, intentionally inputting a wrong password is easily identifiable due to the stereotyped responses from ATMs and may provoke the perpetrator to further terrorize the victim.

Accordingly, there is a long-felt need for a mechanism in authentication systems that, although it authenticates the user upon receiving valid credentials, deliberately suppresses a process otherwise performed after authentication, and yet evades detection of such suppression.


SUMMARY OF THE INVENTION:
In an embodiment, the present invention describes a method for detecting and alerting a user condition during a voice-based detection. The method comprises receiving a voice-based command from said user, detecting presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command, and triggering output of false information pertaining to an operation underway.
In another embodiment, the present invention also provides a system for detecting and alerting a user condition during a voice-based detection. The system comprises a receiver for receiving a voice-based command from said user, a detector for detecting presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command; and a processor for triggering output of false information pertaining to an operation underway.
To further clarify advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF FIGURES:

These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:

Figure 1 shows a flow chart corresponding to a first embodiment of the invention;
Figure 2 shows a detailed internal construction of the apparatus in accordance with a first embodiment of the present invention;
Figure 3 shows a detailed internal construction of the apparatus as described in Fig. 2.

Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present invention. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having benefit of the description herein.

DETAILED DESCRIPTION:

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates.

It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.

Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises... a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.

Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.

Now referring to Figure 1, it can be seen that the present invention provides a method for detecting and alerting a user condition during a voice-based detection, said method comprising:
receiving (step 102) a voice based command from said user;
detecting (step 104) presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command; and
triggering (step 106) output of a false information pertaining to an operation underway.
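
Purely as an illustration of the three steps listed above (and not as part of the specification), a minimal Python sketch of the flow is given below; the fragment values, the simple membership test and the message strings are hypothetical placeholders chosen for this example.

# Hypothetical sketch of the flow of Figure 1 (steps 102-106).
# Fragment values and messages are placeholders, not part of the specification.

KNOWN_PASSWORD_FRAGMENT = "payment"   # first predetermined voice fragment (voice PIN/password)
KNOWN_THREAT_FRAGMENT = "make"        # second predetermined voice fragment (threat indicator)

def receive_voice_command(raw_transcript: str) -> list:
    """Step 102: receive the voice-based command (assumed already transcribed)."""
    return raw_transcript.lower().split()

def detect_fragments(fragments: list) -> tuple:
    """Step 104: detect presence of the first and second predetermined fragments."""
    return (KNOWN_PASSWORD_FRAGMENT in fragments, KNOWN_THREAT_FRAGMENT in fragments)

def trigger_output(has_password: bool, has_threat: bool) -> str:
    """Step 106: output false information when both fragments are present."""
    if has_password and has_threat:
        return "Transaction in progress... please wait"   # false progress, operation suppressed
    if has_password:
        return "Transaction successful"                    # genuine outcome
    return "Authentication failed"

if __name__ == "__main__":
    print(trigger_output(*detect_fragments(receive_voice_command("Make Payment"))))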

In another embodiment, wherein prior to said receiving, authenticating said user through a challenge-response mechanism or a PIN based authentication system.

In yet another embodiment, wherein each of said voice fragment is at least one of a word, phrase or a sentence spoken by a living or a non-living being.

In yet another embodiment, wherein said voice command comprises said pre-determined fragments arranged in a pre-defined order.

In yet another embodiment, wherein in case said detecting comprises detecting absence of the second predetermined voice fragment in the voice-based command, then said triggering comprises triggering output of an appropriate information pertaining to an operation underway.

In yet another embodiment, wherein said voice command comprises said pre-determined fragments arranged in a pre-defined order, said detecting comprises:
identifying within said received voice-gesture, said voice fragments based on a pre-defined criteria;
optionally identifying an order of said fragments within the gesture; and
classifying, based on said identification, said command as having a combination of a regular valid input and a non-regular valid input.
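
A hedged sketch of this classification step is given below; reducing the pre-defined criteria to exact string matching and a single configurable ordering rule is an assumption made only for illustration.

# Illustrative classification of a received command into regular and
# non-regular valid inputs; matching and ordering rules are assumptions.

def classify_command(fragments, password="payment", threat_indicator="make",
                     threat_precedes=True):
    """Return 'invalid', 'regular', or 'regular+non-regular' for a list of fragments."""
    if password not in fragments:
        return "invalid"                      # regular valid input missing
    if threat_indicator not in fragments:
        return "regular"                      # only the regular valid input present
    ordered = fragments.index(threat_indicator) < fragments.index(password)
    if ordered != threat_precedes:
        return "invalid"                      # fragments not in the pre-defined order
    return "regular+non-regular"              # password plus threat indicator

print(classify_command(["make", "payment"]))  # -> regular+non-regular
print(classify_command(["payment"]))          # -> regular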

In yet another embodiment, wherein said operation relates to an event associated with a user account.

In yet another embodiment, wherein said outputting of false information denotes communicating a false success or a false progress of the operation underway, said communication being either a display or a sound alert.

In yet another embodiment, further comprising predicting a compulsive provision of said voice command by said user under the influence of an external threat.

In yet another embodiment, the method further comprises communicating said prediction to a law enforcement authority within a pre-determined area.

Referring to Figure 2, the present invention also provides a system (200) for detecting and alerting a user condition during a voice-based detection, said system (200) comprising:
a receiver (202) for receiving a voice based command from said user;
a detector (204) for detecting presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command; and
a processor (206) for triggering output of a false information pertaining to an operation underway.
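
For readers who prefer code to a block diagram, the sketch below mirrors the receiver (202), detector (204) and processor (206) of Figure 2 as three small Python classes; the interfaces, fragment values and return codes are assumptions made purely for illustration.

# Hypothetical composition of system 200: receiver 202, detector 204, processor 206.

class Receiver:                                   # element 202
    def receive(self, transcript: str) -> list:
        return transcript.lower().split()

class Detector:                                   # element 204
    def __init__(self, password: str, threat_indicator: str):
        self.password, self.threat_indicator = password, threat_indicator
    def detect(self, fragments: list) -> tuple:
        return (self.password in fragments, self.threat_indicator in fragments)

class Processor:                                  # element 206
    def trigger(self, has_password: bool, has_threat: bool) -> str:
        if has_password and has_threat:
            return "FALSE_PROGRESS"               # suppress operation, output false information
        return "NORMAL" if has_password else "REJECT"

class System200:
    def __init__(self):
        self.receiver = Receiver()
        self.detector = Detector(password="payment", threat_indicator="make")
        self.processor = Processor()
    def handle(self, transcript: str) -> str:
        return self.processor.trigger(*self.detector.detect(self.receiver.receive(transcript)))

print(System200().handle("Make Payment"))         # -> FALSE_PROGRESS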

Referring to figure 3, yet another typical hardware configuration of the system 200 in the form of a computer system 300 is shown. The computer system 300 can include a set of instructions that can be executed to cause the computer system 300 to perform any one or more of the methods disclosed. The computer system 300 may operate as a standalone device or may be connected, e.g., using a network, to other computer systems or peripheral devices.

In a networked deployment, the computer system 300 may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 300 can also be implemented as or incorporated across various devices, such as a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop computer, a desktop computer, a communications device, a wireless telephone, a land-line telephone, a web appliance, a network router, switch or bridge, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while a single computer system 300 is illustrated, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.

The computer system 300 may include a processor 302, e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both. The processor 302 may be a component in a variety of systems. For example, the processor 302 may be part of a standard personal computer or a workstation. The processor 302 may be one or more general processors, digital signal processors, application specific integrated circuits, field programmable gate arrays, servers, networks, digital circuits, analog circuits, combinations thereof, or other now known or later developed devices for analysing and processing data. The processor 302 may implement a software program, such as code generated manually (i.e., programmed).

The computer system 300 may include a memory 304, such as a memory 304 that can communicate via a bus 308. The memory 304 may be a main memory, a static memory, or a dynamic memory. The memory 304 may include, but is not limited to computer readable storage media such as various types of volatile and non-volatile storage media, including but not limited to random access memory, read-only memory, programmable read-only memory, electrically programmable read-only memory, electrically erasable read-only memory, flash memory, magnetic tape or disk, optical media and the like. In one example, the memory 304 includes a cache or random access memory for the processor 302. In alternative examples, the memory 304 is separate from the processor 302, such as a cache memory of a processor, the system memory, or other memory. The memory 304 may be an external storage device or database for storing data. Examples include a hard drive, compact disc ("CD"), digital video disc ("DVD"), memory card, memory stick, floppy disc, universal serial bus ("USB") memory device, or any other device operative to store data. The memory 304 is operable to store instructions executable by the processor 302. The functions, acts or tasks illustrated in the figures or described may be performed by the programmed processor 302 executing the instructions stored in the memory 304. The functions, acts or tasks are independent of the particular type of instructions set, storage media, processor or processing strategy and may be performed by software, hardware, integrated circuits, firm-ware, micro-code and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing and the like.

As shown, the computer system 300 may or may not further include a display unit 310, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, a cathode ray tube (CRT), a projector, a printer or other now known or later developed display device for outputting determined information. The display 310 may act as an interface for the user to see the functioning of the processor 302, or specifically as an interface with the software stored in the memory 304 or in the drive unit 316.

Additionally, the computer system 300 may include an input device 312 configured to allow a user to interact with any of the components of system 300. The input device 312 may be a number pad, a keyboard, or a cursor control device, such as a mouse, or a joystick, touch screen display, remote control or any other device operative to interact with the computer system 300.

The computer system 300 may also include a disk or optical drive unit 316. The disk drive unit 316 may include a computer-readable medium 322 in which one or more sets of instructions 324, e.g. software, can be embedded. Further, the instructions 324 may embody one or more of the methods or logic as described. In a particular example, the instructions 324 may reside completely, or at least partially, within the memory 304 or within the processor 302 during execution by the computer system 300. The memory 304 and the processor 302 also may include computer-readable media as discussed above.

The present invention contemplates a computer-readable medium that includes instructions 324 or receives and executes instructions 324 responsive to a propagated signal so that a device connected to a network 326 can communicate voice, video, audio, images or any other data over the network 326. Further, the instructions 324 may be transmitted or received over the network 326 via a communication port or interface 320 or using a bus 308. The communication port or interface 320 may be a part of the processor 302 or may be a separate component. The communication port 320 may be created in software or may be a physical connection in hardware. The communication port 320 may be configured to connect with a network 326, external media, the display 310, or any other components in system 300, or combinations thereof. The connection with the network 326 may be a physical connection, such as a wired Ethernet connection or may be established wirelessly as discussed later. Likewise, the additional connections with other components of the system 300 may be physical connections or may be established wirelessly. The network 326 may alternatively be directly connected to the bus 308.

The network 326 may include wired networks, wireless networks, Ethernet AVB networks, or combinations thereof. The wireless network may be a cellular telephone network, an 802.11, 802.16, 802.20, 802.1Q or WiMax network. Further, the network 326 may be a public network, such as the Internet, a private network, such as an intranet, or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to TCP/IP based networking protocols.

In an alternative example, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement various parts of the system 300.

Applications that may include the systems can broadly include a variety of electronic and computer systems. One or more examples described may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.

The system described may be implemented by software programs executable by a computer system. Further, in a non-limiting example, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement various parts of the system.

The system is not limited to operation with any particular standards and protocols. For example, standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) may be used. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed are considered equivalents thereof.

In the following paragraphs, a detailed description about exemplary implementation of the invention and a control flow within such exemplary implementation has been provided. It should however, be understood that there may be other analogous implementations related to claimed method and apparatus that need not strictly follow the components, steps and sequence of steps as described in the following paragraphs. Thus, the following explanation shall be strictly interpreted as one of the many conceivable exemplary implementations of the claimed subject matter, and the scope of the claims is intended to be restricted only on the basis of the claim language and its equivalents.

EXEMPLARY IMPLEMENTATION

The present invention as described in the aforesaid embodiments may be incorporated as a security mechanism in almost every utility that incorporates voice-recognition-based authentication, or a combination of challenge-response (or PIN-based) or any other authentication mechanism with a voice-recognition-based detection system. Another requirement for incorporation of the present invention in a utility is that the utility shall include at least a graphical or audio-visual user interface that receives input from the user and thereafter sounds or displays a sequential occurrence of events after a successful authentication.

A very close example of the aforesaid utility may be an automated teller machine (ATM) that accepts voice commands for user identification and thereafter displays or sounds the occurrence of the requested operation, which may be a cash-withdrawal request, cash enquiry request, PIN change request, cash deposit request, etc. Likewise, the present invention may also find application in various multi-stage-authentication-based access systems associated with a protected entity (for example, a bank locker). Such access systems may also require a voice-based password for identifying a user and thereafter display or sound the occurrence of an event (for example, displaying the notice "Granting access… please have patience"), thereby finally leading to grant or denial of access.

Now, prior to discussing an exemplary operation of the invention, it may be recollected that a primary objective of the present invention is to sense if the user is under stress during the authentication stages, and accordingly subvert an otherwise intended operation upon having sensed such a situation, while simultaneously portraying as if the operation is proceeding in the right direction. Moreover, the present invention is applicable only in systems that are configured to accept voice commands to identify or authenticate a user.

In an exemplary operation, a user who is under threat or has been waylaid in an ATM premises provides a voice command to the ATM to prove his identity. In order to address the demand posed by the perpetrator, the perturbed user under such circumstances may provide a command that is a combination of a first pre-defined identifier (i.e. a voice PIN or password) and another pre-defined identifier that acts as a threat indicator. In an example, the voice password and the threat indicator act as voice fragments of the same voice command and may be spoken by the user with the minimum possible "in-between" pause or gap. Each of said voice fragments may be a word, phrase or sentence, and both fragments may be arranged in a pre-defined order within the voice instruction, e.g. the predefined voice identifier succeeding the voice PIN. For example, 'Make Payment' may be spoken as a composite voice-based instruction, wherein "make" may be the threat indicator, while "payment" may be the voice-based password.

The present invention's exemplary system, as implemented within the automated dispenser machine, detects the presence of the first predetermined voice fragment, i.e. the voice-based PIN/password, and of the second predetermined voice fragment, i.e. the threat indicator, in the voice-based command. Specifically, the detecting comprises identifying the voice fragments based on comparison with pre-stored identifiers. In addition, the order of presence of said voice fragments in the voice command is also identified to establish the identity and stressful condition of the user. For example, the threat indicator may be configured to either precede or succeed the voice-based password.

Based on the aforesaid exemplary identification, the voice-based instruction provided by the user may be declared as being composed of a regular valid input (i.e. the password) and a non-regular valid input (i.e. the threat indicator), provided that both voice fragments as provided by the user have been verified.

Thereafter, based on the identification of the voice instruction as being a composition of regular and irregular valid inputs, a display of false information pertaining to an operation is triggered, wherein such operation relates to the user account (i.e. a bank account) and is otherwise performed when the voice instruction includes only the regular valid input. The operation may pertain to cash withdrawal, account balance checking, cash deposit, etc. The outputting of false information denotes communicating (i.e. displaying or sounding) a false success or a false progress of the operation, said communication being either through a display or a sound alert or a combination of both. In other words, while performance of the operation is subverted, a fictitious display or audible alert may indicate that the operation is progressing as usual.
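
As one possible, purely illustrative realisation of such a false output, the snippet below emits a fabricated progress sequence on the display channel; every message string, the pacing and the use of print as the display are assumptions.

import time

# Illustrative false-progress output; message texts and timings are assumptions.
FAKE_SEQUENCE = [
    "Verifying account details...",
    "Connecting to bank server...",
    "Processing withdrawal request...",
]

def output_false_progress(display=print, step_delay=1.0):
    """Show a normal-looking sequence of steps while the real operation is suppressed."""
    for message in FAKE_SEQUENCE:
        display(message)
        time.sleep(step_delay)      # pace the messages so they resemble genuine progress
    display("Transaction Success")  # false success indication

output_false_progress(step_delay=0.1)   # quick demonstration run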

However, in case there is an absence of the threat indicator and only the voice-based password is present, a normal sequence of events pertaining to an operation is displayed to depict the operation. Accordingly, a final dispensation of cash also results. As may be understood, such a scenario is expected to happen when the user is acting of his own will and without any threat.

Now, going back to the portrayal of false information by the present system, in an example, the system indicates a pseudo 'Transaction Success' message so as to provide a false impression of the 'transaction being executed' to the perpetrators. As a part of such false display of information, the system displays the sequence of steps that is observed in normal circumstances. The display may proceed at a slow pace and may repetitively depict the same steps in a particular order, thereby leading to a substantial elapse of time while keeping the perpetrator's hopes of a successful transaction alive.

In other examples, as a part of displaying a false progress, a 'cash withdrawal' transaction is shown to be initiated and then terminated just on the verge of cash dispensation. At this juncture, various types of fictitious yet realistic messages may be displayed to depict the rationale behind the termination, say 'Server is down. Please try after sometime', 'Insufficient funds', etc. Overall, the progress of the operation and its eventual termination may be timed so that there is a substantial elapse of time (say, at least 5 minutes) between the receipt of the request for the transaction and the display of the fictitious messages.
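
A minimal sketch of such a timed, deliberately stalled transaction is given below; the five-minute figure comes from the paragraph above, while the loop structure, the tick interval and the message texts are assumptions.

import time

# Hypothetical stalling routine: repeat a progress message until roughly five
# minutes have elapsed, then terminate with a realistic-looking failure message.
STALL_SECONDS = 5 * 60
FAILURE_MESSAGES = ["Server is down. Please try after sometime.", "Insufficient funds."]

def stall_and_terminate(display=print, stall_seconds=STALL_SECONDS, tick=10):
    start = time.monotonic()
    while time.monotonic() - start < stall_seconds:
        display("Dispensing cash... please wait")
        time.sleep(tick)                 # repeat the same step to consume time
    display(FAILURE_MESSAGES[0])         # terminate on the verge of dispensation

# stall_and_terminate(stall_seconds=3, tick=1)   # shortened demonstration values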

The significance of the elapse of time in both of the aforementioned scenarios is to allow law enforcement authorities such as the police, bomb disposal squad, forensic experts, etc. to reach the crime spot, prevent any untoward incident, and apprehend the perpetrator, before the perpetrator realizes that the transaction will never materialize. The alert to the police or any other authority located in the vicinity may be provided through radio communication or any other form of wireless communication. As may be understood, such an alert may be communicated as soon as the presence of the aforesaid two fragments in the voice command has been identified.
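
The alert itself could be dispatched in many ways; the sketch below simply posts a JSON payload to a hypothetical law-enforcement endpoint as soon as both fragments have been detected. The URL, the payload fields and the HTTP transport are all assumptions, not part of the specification.

import json
import urllib.request

# Hypothetical duress alert; endpoint URL and payload fields are placeholders.
ALERT_URL = "https://example.invalid/duress-alert"

def send_duress_alert(terminal_id: str, location: str) -> None:
    """Notify a nearby authority that the user appears to be under threat."""
    payload = json.dumps({"terminal": terminal_id,
                          "location": location,
                          "event": "possible user under duress"}).encode()
    request = urllib.request.Request(ALERT_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
    try:
        urllib.request.urlopen(request, timeout=5)   # fire the alert immediately
    except OSError:
        pass   # a real deployment would fall back to another communication channel

# send_duress_alert("ATM-001", "branch premises")    # example invocation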

While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.

The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein.

Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.

Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

Claims:

WE CLAIM:

1. A method for detecting and alerting a user condition during a voice-based detection, said method comprising:
receiving (step 102) a voice based command from said user;
detecting (step 104) presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command; and
triggering (step 106) output of a false information pertaining to an operation underway.

2. The method as claimed in claim 1, wherein prior to said receiving, authenticating said user through a challenge-response mechanism or a PIN based authentication system.

3. The method as claimed in claim 1, wherein each of said voice fragment is at least one of a word, phrase or a sentence spoken by a living or a non-living being.

4. The method as claimed in claim 1, wherein said voice command comprises said pre-determined fragments arranged in a pre-defined order.

5. The method as claimed in claim 1, wherein in case said detecting comprises detecting absence of the second predetermined voice fragment in the voice-based command, then said triggering comprises triggering output of an appropriate information pertaining to an operation underway.

6. The method as claimed in claim 1, said detecting comprises:
identifying within said received voice-gesture, said voice fragments based on a pre-defined criteria;
optionally identifying an order of said fragments within the gesture; and
classifying, based on said identification, said command as having a combination of a regular valid input and a non-regular valid input.

7. The method as claimed in claim 1, wherein said operation relates to an event associated with a user account.

8. The method as claimed in claim 1, wherein said outputting of false information denotes communicating a false success or a false progress of the operation underway, said communication being either a display or a sound alert.

9. The method as claimed in claim 1, further comprising predicting a compulsive provision of said voice command by said user under the influence of an external threat.

10. The method as claimed in claim 9, further comprising:
communicating a law enforcement authority within a pre-determined area to communicate said prediction.

11. A system (200) for detecting and alerting a user condition during a voice-based detection, said system (200) comprising:
a receiver (202) for receiving a voice based command from said user;
a detector (204) for detecting presence of a first predetermined voice fragment and a second predetermined voice fragment in the voice-based command; and
a processor (206) for triggering output of a false information pertaining to an operation underway.

Documents

Application Documents

# Name Date
1 Power of Attorney [01-09-2015(online)].pdf 2015-09-01
2 Form 9 [01-09-2015(online)].pdf 2015-09-01
3 Form 5 [01-09-2015(online)].pdf 2015-09-01
4 Form 3 [01-09-2015(online)].pdf 2015-09-01
5 Form 18 [01-09-2015(online)].pdf 2015-09-01
6 Drawing [01-09-2015(online)].pdf 2015-09-01
7 Description(Complete) [01-09-2015(online)].pdf 2015-09-01
8 2746-del-2015-Form-1-(29-09-2015).pdf 2015-09-29
9 2746-del-2015-Correspondence Others-(29-09-2015).pdf 2015-09-29
10 2746-del-2015-GPA-(10-03-2016).pdf 2016-03-10
11 2746-del-2015-Correspondence Others-(10-03-2016).pdf 2016-03-10
12 2746-DEL-2015-FER.pdf 2020-01-29
13 2746-DEL-2015-CLAIMS [14-06-2021(online)].pdf 2021-06-14
14 2746-DEL-2015-COMPLETE SPECIFICATION [14-06-2021(online)].pdf 2021-06-14
15 2746-DEL-2015-FER_SER_REPLY [14-06-2021(online)].pdf 2021-06-14
16 2746-DEL-2015-OTHERS [14-06-2021(online)].pdf 2021-06-14
17 2746-DEL-2015-US(14)-HearingNotice-(HearingDate-29-12-2022).pdf 2022-12-13
18 2746-DEL-2015-Correspondence to notify the Controller [27-12-2022(online)].pdf 2022-12-27
19 2746-DEL-2015-FORM-26 [28-12-2022(online)].pdf 2022-12-28
20 2746-DEL-2015-FORM-8 [10-01-2023(online)].pdf 2023-01-10
21 2746-DEL-2015-Written submissions and relevant documents [10-01-2023(online)].pdf 2023-01-10
22 2746-DEL-2015-IntimationOfGrant24-02-2023.pdf 2023-02-24
23 2746-DEL-2015-PatentCertificate24-02-2023.pdf 2023-02-24

Search Strategy

1 Searchstrategy_27-01-2020.pdf

ERegister / Renewals

3rd: 10 Mar 2023 (From 01/09/2017 To 01/09/2018)
4th: 10 Mar 2023 (From 01/09/2018 To 01/09/2019)
5th: 10 Mar 2023 (From 01/09/2019 To 01/09/2020)
6th: 10 Mar 2023 (From 01/09/2020 To 01/09/2021)
7th: 10 Mar 2023 (From 01/09/2021 To 01/09/2022)
8th: 10 Mar 2023 (From 01/09/2022 To 01/09/2023)
9th: 10 Mar 2023 (From 01/09/2023 To 01/09/2024)
10th: 13 Mar 2024 (From 01/09/2024 To 01/09/2025)
11th: 01 Sep 2025 (From 01/09/2025 To 01/09/2026)