
Method And An Electronic Device For User Interaction With The Electronic Device

Abstract: A method and an electronic device for user interaction with an electronic device are described. The method comprises identifying at least one gesture detected by a sensor unit, wherein the sensor unit comprises a camera sensor and a Heart Rate Monitor (HRM) sensor of the electronic device. Further, the method comprises executing at least one task corresponding to an application in accordance with the at least one gesture so identified. FIG. 2


Patent Information

Application #
Filing Date
26 February 2016
Publication Number
45/2017
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2023-08-24
Renewal Date

Applicants

SAMSUNG R&D Institute India - Bangalore Private Limited
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037

Inventors

1. Gandhi Gurunathan Rajendran
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037
2. Ilavarasu Jayabalan
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037
3. Raghavendra Pai
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037
4. Vivek Chamarajanagar Dwarakanath
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037
5. Samudrala Nagaraju
#2870, Phoenix Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanekundi Circle, Marathahalli Post, Bengaluru, Karnataka 560037

Specification

Claims: STATEMENT OF CLAIMS
We claim:
1. A method for user interaction with an electronic device, wherein the method comprises:
identifying, by a device operation module in the electronic device, at least one gesture detected by a sensor unit, wherein the sensor unit comprises a camera sensor and a Heart Rate Monitor (HRM) sensor of the electronic device; and
executing, by the device operation module, at least one task corresponding to an application in accordance with the at least one gesture so identified.
2. The method as claimed in claim 1, wherein the at least one gesture comprises one of a single gesture on the camera sensor, a plurality of gestures on the camera sensor, a single gesture on the HRM sensor, a plurality of gestures on the HRM sensor, and a combination of at least one gesture on the HRM sensor and at least one gesture on the camera sensor.
3. An electronic device for user interaction with the electronic device, wherein the electronic device comprises a device operation module configured to:
identify at least one gesture detected by a sensor unit, wherein the sensor unit comprises a camera sensor and a Heart Rate Monitor (HRM) sensor of the electronic device; and
execute at least one task corresponding to an application in accordance with the at least one gesture so identified.
4. The electronic device as claimed in claim 3, wherein the at least one gesture comprises one of a single gesture on the camera sensor, a plurality of gestures on the camera sensor, a single gesture on the HRM sensor, a plurality of gestures on the HRM sensor, and a combination of at least one gesture on the HRM sensor and at least one gesture on the camera sensor.

Dated this 26th February, 2016
Signature:
Name of the Signatory: Dr. Kalyan Chakravarthy
Description: The following specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:

TECHNICAL FIELD
[001] The embodiments herein generally relate to the field of handheld electronic devices, and more particularly to enhancing user experience while interacting with the electronic device.

BACKGROUND
[002] Rapid development in the technology of electronic devices such as smartphones and tablets has enabled packing more functionality into smaller devices. This allows a user to comfortably watch media, browse the web, and read books on an electronic device. With electronic devices used for these tasks, the preferred display screen size is shifting from smaller to larger sizes. However, as screens become larger, operating or interacting with the electronic device becomes difficult, and the very purpose of a handheld device being comfortably usable, even while multitasking, may be defeated. Existing solutions provide a shrink-screen option that scales down the content displayed on the screen so that application icons are brought within reach of one hand. The desired tasks may then be performed easily with hand gestures. However, with the screen scaled down, the screen view can be cluttered. In another existing method, the content displayed on the screen scrolls down so that icons initially out of reach come within reach of the fingers of the hand. However, to perform a particular task, the user still needs to browse through the application icons, select an application, and then use it to perform the task. For quick launch of an application, existing methods provide gesture-based application launch. However, the existing hand-gesture options available for large screens are limited, which restricts further handling of the launched application.

OBJECTS
[003] The principal object of the embodiments herein is to provide a method and an apparatus (electronic device) for enabling a user to interact with the electronic device using a sensor unit, wherein the sensor unit includes a camera sensor and a Heart Rate Monitor (HRM) sensor.
[004] Another object of the invention is to execute one or more tasks among a plurality of tasks corresponding to an application in accordance with at least one gesture detected by at least one of a camera sensor and a Heart Rate Monitor (HRM) sensor.

SUMMARY
[005] In view of the foregoing, an embodiment herein provides a method for user interaction with an electronic device. The method comprises identifying at least one gesture detected by a sensor unit, wherein the sensor unit comprises a camera sensor and a Heart Rate Monitor (HRM) sensor of the electronic device. Further, the method comprises executing at least one task corresponding to an application in accordance with the at least one gesture so identified.
[006] Embodiments further disclose an electronic device for user interaction with the electronic device. The electronic device comprises a device operation module configured to identify at least one gesture detected by a sensor unit, wherein the sensor unit comprises a camera sensor and a Heart Rate Monitor (HRM) sensor of the electronic device. Further, the device operation module is configured to execute at least one task corresponding to an application in accordance with the at least one gesture so identified.
[007] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.


BRIEF DESCRIPTION OF FIGURES
[008] The embodiments of this invention are illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
[009] FIG. 1 illustrates a plurality of components of an electronic device for user interaction with the electronic device through a sensor unit that includes a camera sensor and a Heart rate Monitor (HRM) sensor , according to embodiments as disclosed herein;
[0010] FIG. 2 is a flow diagram illustrating a method for user interaction with the electronic device through the sensor unit that includes the camera sensor and the HRM sensor, according to embodiments as disclosed herein;
[0011] FIG. 3 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein;
[0012] FIG. 4 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein;
[0013] FIG. 5a and FIG. 5b are example tasks executed, based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein;
[0014] FIG. 6 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein; and
[0015] FIG. 7 illustrates a computing environment implementing the method for user interaction with the electronic device through a sensor unit, as disclosed in the embodiments herein.

DETAILED DESCRIPTION
[0016] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0017] The embodiments herein achieve a method and an electronic device enabling a user to interact with the electronic device through a sensor unit, wherein the sensor unit includes a camera sensor and a Heart Rate Monitor (HRM) sensor. The method includes executing a task among a plurality of tasks in accordance with one or more gestures detected by the camera sensor and the HRM sensor of the electronic device. Wherever necessary, the gestures can be performed easily with a single hand. A unique combination of one or more gestures on the camera sensor and/or the HRM sensor enables performing a plurality of tasks corresponding to an application on the electronic device. Thus, the combination of the camera sensor and the HRM sensor as the sensor unit enables executing a multitude of tasks using hand gestures, even single-hand gestures, effectively enhancing the user experience. The camera sensor refers to one or more cameras on the electronic device, such as a back camera, a front camera and so on.
[0018] In an embodiment, the electronic device (apparatus) is a mobile phone, a tablet, a personal digital assistant, a laptop, a wearable device or any other electronic device.
[0019] Referring now to the drawings, and more particularly to FIGS. 1 through 7, where similar reference characters denote corresponding features consistently throughout the figures, there are shown embodiments.
[0020] FIG. 1 illustrates a plurality of components of an electronic device 100 for user interaction with the electronic device 100 through a sensor unit that includes the camera sensor and the Heart rate Monitor (HRM) sensor, according to embodiments as disclosed herein.
[0021] Referring to figure 1, the electronic device 100 is illustrated in accordance with an embodiment of the present subject matter. In an embodiment, the electronic device 100 may include at least one processor 102, an input/output (I/O) interface 104 (herein a configurable user interface), a memory 106. The at least one processor 102 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 102 is configured to fetch and execute computer-readable instructions stored in the memory 106.
[0022] The I/O interface 104 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface such as a display screen, a camera interface for the camera sensor (such as the back camera and the front camera on the electronic device 100), an HRM interface for the HRM sensor, and the like. The camera sensor and the HRM sensor together provide the sensor unit on the electronic device 100. Thus, the interfaces for the camera sensor and the HRM sensor enable interaction of the user with the electronic device 100 through one or more gestures. The I/O interface 104 may allow the electronic device 100 to communicate with other devices. The I/O interface 104 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, Local Area Network (LAN), cable, etc., and wireless networks, such as Wireless LAN, cellular, Device to Device (D2D) communication networks, Wi-Fi networks and so on. The memory 106 may include modules 108 and data 112. The modules 108 include routines, programs, objects, components, data structures, and so on, which perform particular tasks or functions or implement particular abstract data types. In one implementation, the modules 108 may include a device operation module 110. The device operation module 110 can be configured to allow the user to handle one or more tasks of an application by interacting with the electronic device 100 through the sensor unit. The device operation module 110 can be configured to identify one or more gestures detected by the sensor unit including the camera sensor and the HRM sensor of the electronic device 100. In an embodiment, the one or more gestures can include a single gesture on the camera sensor, a plurality of gestures on the camera sensor, a single gesture on the HRM sensor, a plurality of gestures on the HRM sensor, or a combination of one or more gestures on the HRM sensor and one or more gestures on the camera sensor. Upon detection of the one or more gestures, the device operation module 110 can be configured to execute one or more tasks corresponding to the application on the electronic device 100 in accordance with the gestures so identified. A mapping of sets of gestures against the tasks to be performed can be maintained in the memory 106.
[0023] In an embodiment, the gesture detected by the camera sensor or the HRM sensor of the sensor unit may include a left swipe, a right swipe, a single tap, a double tap, a long press, a short press, a vertical scroll in upward or downward direction and so on that can be detected by the camera sensor or the HRM sensor.
[0024] The use case examples for tasks that can be performed using one or more gestures on the camera sensor and the HRM sensor of the electronic device 100 in accordance with one or more gestures detected are explained in conjunction with Figs. 3 to 6.
[0025] The modules 108 may include programs or coded instructions that supplement applications and functions of the electronic device 100. The data 112, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 108. Further, the names of the other components and modules of the electronic device 100 are illustrative and need not be construed as a limitation.
[0026] FIG. 2 is a flow diagram illustrating a method 200 for user interaction with the electronic device 100 through the sensor unit that includes the camera sensor and the HRM sensor, according to embodiments as disclosed herein. At step 202, the method 200 includes allowing the device operation module 110 to identify one or more gestures detected by the sensor unit that includes the camera sensor and the HRM sensor of the electronic device 100. In an embodiment, the one or more gestures can include a single gesture on the camera sensor, a plurality of gestures on the camera sensor, a single gesture on the HRM sensor, a plurality of gestures on the HRM sensor, or a combination of one or more gestures on the HRM sensor and one or more gestures on the camera sensor. In an embodiment, the camera sensor may be the back camera, the front camera or any other camera on the electronic device 100. Upon detection of the sequence of gestures, at step 204 the method 200 allows the device operation module 110 to execute one or more tasks corresponding to the application on the electronic device 100 in accordance with the gestures so identified. The mapping of sets of gestures against the tasks to be performed can be maintained in the memory 106 for a plurality of applications on the electronic device 100. An example mapping is provided in Table 1 below.
Table 1:

Application (state) | Camera sensor gesture | HRM sensor gesture | Sample use case (task performed)
Camera (back camera active) | Double tap | x | Switch to front camera
Camera (back camera active) | Long press (close) | x | Preview the image
Camera (back camera active) | Finger swipe up | x | Show the latest front-camera images in that session
Camera (back camera active) | Finger swipe down | x | Show the latest back-camera images in that session
Camera (back camera in preview mode) | Long press (close) | Single tap | Move to the next image in preview mode
Camera (back camera in preview mode) | Long press (close) | Double tap | Change the background image of the existing preview image
Camera (back camera in preview mode) | Long press (close) | Long press | Share the current preview image
Camera (back camera in preview mode) | Finger swipe left | x | Move the preview image to the left
Camera (back camera in preview mode) | Finger swipe right | x | Move the preview image to the right
Camera (back camera in preview mode) | Finger swipe up | x | Directly share the preview image
Camera (back camera in preview mode) | Finger swipe down | x | Directly delete the preview image
Gallery | Front-camera touch | x | Show front-camera images
Gallery | Back-camera touch | x | Show back-camera images
Gallery | Back-camera touch | Long press | Show the best shots
Gallery | x | Single tap | Select the category for sorting the images
Gallery | x | Double tap | Change the focus of category folders
Gallery | x | Long press | Start the slide show for the focused category
Smart Stay | Front camera | x | Take a screenshot of the current screen
Video capture (during the video capture) | x | Long press | Start a clip (subset of the main video) during capture
Video capture (after the video capture) | x | Long press | Show all the clipped videos
Video capture (after the video capture) | x | Single tap | Highlight the next clipped video
Video capture (after the video capture) | x | Double tap | Share the currently highlighted clipped video
Calendar | x | Single tap | Current/next schedule
Calendar | x | Double tap | Current day schedule
Calendar | x | Long press | Follow-up schedule list

('x' indicates that no gesture is required on that sensor.)
[0027] A gesture performed by the user can be detected by the camera sensor and the HRM sensor when these sensors are active after certain operations of the electronic device 100. For example, on a power-on operation, or on activation of a particular application such as a camera application or a gallery application of the electronic device 100, the camera sensor and the HRM sensor remain active for a predefined time (generally up to a few seconds).
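The time-limited activation window described above can be modelled as a small timer object. This is a hypothetical sketch, assuming a configurable window length and an injectable clock; the specification only says the sensors remain active for a predefined time.

```python
import time

class SensorActivationWindow:
    """Hypothetical model of the activation window: after a triggering event
    (power-on, opening the camera or gallery application), the camera and
    HRM sensors accept gestures only for a predefined number of seconds."""

    def __init__(self, window_seconds=5.0, clock=time.monotonic):
        self.window_seconds = window_seconds
        self.clock = clock          # injectable for deterministic testing
        self.activated_at = None    # no triggering event seen yet

    def activate(self):
        """Called on power-on or application launch."""
        self.activated_at = self.clock()

    def is_active(self):
        """True while gestures should still be accepted."""
        if self.activated_at is None:
            return False
        return (self.clock() - self.activated_at) <= self.window_seconds
```

A gesture handler would check `is_active()` before consulting the gesture-to-task mapping, and ignore sensor events once the window has elapsed.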
[0028] In one implementation of the method 200 in the electronic device 100, the camera sensor and the HRM sensor may be placed adjacent to each other such that both of them are accessible to, or within reach of, finger movement of a single hand of the user. The various actions in the method 200 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 2 may be omitted.
[0029] A plurality of use-case examples for tasks that can be performed using one or more gestures on the camera sensor and the HRM sensor, as proposed by the method 200, are explained in conjunction with FIGS. 3 to 6 with illustrations.
[0030] FIG. 3 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein. FIG. 3 illustrates the electronic device 100 with a sensor unit 300, in which a camera sensor 302 (a back camera) and a HRM sensor 304 are placed adjacent to each other. Generally, in the electronic device 100, when the device screen is turned on, the camera sensor 302 and the HRM sensor 304 are activated for a few seconds. During this active period, if the user places a finger 306 on the camera sensor 302 for a few seconds, a menu 310 corresponding to a quick-launch application appears on the display screen. The user can scroll the menu dialog using the HRM sensor 304 with a swipe up/down gesture using a finger 308. Further, a single tap gesture (the second gesture in the sequence of gestures performed) following the swipe up/down gesture on the HRM sensor 304 can launch 314 the selected application, 'SHealth', on the display screen. Thus, the camera sensor and the HRM sensor are easily accessed by a single hand of the user for performing a plurality of tasks corresponding to the application of interest.
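The FIG. 3 interaction (hold the camera sensor to open a quick-launch menu, swipe the HRM sensor to scroll it, tap to launch) can be sketched as a small state machine. This is a hypothetical illustration; the application names and event-handler names are invented for the example, not taken from the specification.

```python
# Hypothetical sketch of the FIG. 3 quick-launch flow. A long press (cover)
# on the camera sensor opens the menu; HRM swipes move the highlight; a
# single tap on the HRM sensor launches the highlighted application.

class QuickLaunchMenu:
    def __init__(self, apps):
        self.apps = apps      # ordered quick-launch entries
        self.open = False
        self.index = 0        # currently highlighted entry

    def on_camera_long_press(self):
        """Finger held on the camera sensor: open the menu."""
        self.open = True
        self.index = 0

    def on_hrm_swipe(self, direction):
        """Swipe up/down on the HRM sensor: scroll the highlight."""
        if self.open:
            step = 1 if direction == "down" else -1
            self.index = (self.index + step) % len(self.apps)

    def on_hrm_single_tap(self):
        """Single tap on the HRM sensor: launch the highlighted app."""
        if self.open:
            self.open = False
            return self.apps[self.index]
        return None

menu = QuickLaunchMenu(["SHealth", "Camera", "Gallery"])
menu.on_camera_long_press()
menu.on_hrm_swipe("down")
launched = menu.on_hrm_single_tap()   # "Camera"
```

Note how the two sensors play complementary roles: the camera sensor gates the mode (menu open/closed) while the HRM sensor navigates and confirms, which is what makes the single-hand sequence possible.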
[0031] FIG. 4 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein. FIG. 4 illustrates the electronic device 100 with a camera application 400 active on the display screen. The user may cover (close) a camera sensor 404 of a sensor unit 402 for a few seconds, whereupon a first task of the camera application, such as camera settings 408, is launched. Further, the user performs a swipe gesture on the HRM sensor 406 of the sensor unit 402 to scroll between different modes of the camera settings and select a mode. Further, a single tap on the HRM sensor 406 (the third gesture in the sequence of gestures performed) can apply the highlighted or selected mode to the camera application, performing a second task corresponding to the camera application. Thus, the camera sensor and the HRM sensor are easily accessed by a single hand of the user for performing a plurality of tasks corresponding to the application of interest using the single-handed operation mode.
[0032] FIG. 5a and FIG. 5b are example tasks executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein.
[0033] FIG. 5a illustrates the electronic device 100 with a camera application active in a selfie mode 500. The user takes a selfie with one hand by performing a click (tap) gesture on a HRM sensor 504 of a sensor unit 502. Further, to see a preview of the selfie taken, the user covers (closes) the camera sensor (back camera) 506. The back camera 506 can be used in this way because it remains active for a few seconds after a selfie (an image of the user himself) is taken with the front camera of the electronic device 100. Further, a long press on the HRM sensor 508 can execute a task of the camera application, such as sharing the selfie image to a desired location such as a social networking site and so on, whereas a double tap on the HRM sensor 508 can delete the selfie image. Another gesture, such as closing the camera sensor 506 followed by a click on the HRM sensor 508, may show a preview of all front-camera images.
[0034] As depicted in FIG. 5b, a gallery application 508 is open on the display screen of the electronic device 100. Since the camera sensor (not shown) remains active for a few seconds when the gallery application is opened, the user places a finger on the camera sensor and only the recently taken pictures are displayed 508. Further, a single tap on the camera sensor rotates between pictures (images) taken using the back camera, the front camera, and recently taken pictures. Further, a swipe on the HRM sensor (not shown) can allow scrolling between the images displayed. A single tap on the HRM sensor can share the highlighted picture, or a double tap on the HRM sensor can set the selected image as wallpaper.
[0035] FIG. 6 is an example illustrating one or more tasks of an application, executed based on one or more gestures detected by the camera sensor and the HRM sensor, according to embodiments as disclosed herein. As depicted in the figure, the electronic device 100 is an edge device capable of displaying content on an edge 602 of the electronic device 100. Whenever the power key of the electronic device 100 is pressed to turn on the electronic device 100, a HRM sensor 604 of a sensor unit 600 is active for a few seconds. A gesture such as a tap on the HRM sensor 604 displays, on the edge 602, a first preset color corresponding to a first favorite contact in a favorite-contact list. Thus, a first task corresponding to the 'favorite contact' application is executed. Further successive taps on the HRM sensor 604 switch the colors displayed on the edge 602, each color indicating the next favorite contact currently selected. Once the desired color corresponding to the desired favorite contact appears, the user can cover (close) a camera sensor 606 of the sensor unit 600. This gesture on the camera sensor 606 can send a preconfigured message to the desired favorite contact. By using other preconfigured gestures, such as a double tap, a second preconfigured message can be sent. The displayed color darkens or blinks to confirm that the message was sent. Thus, the camera sensor and the HRM sensor are easily accessed by a single hand of the user for performing a plurality of tasks corresponding to the application of interest using the single-handed operation mode.
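The FIG. 6 flow (taps on the HRM sensor cycle through favorite contacts, each shown as a preset edge color, then a camera-sensor gesture sends a preconfigured message to the selected contact) can be sketched as follows. The contact names, colors and message texts are invented for illustration; only the cycling-and-send behaviour comes from the description above.

```python
# Hypothetical sketch of the FIG. 6 favorite-contact interaction. Each HRM
# tap advances the selection (returning the edge color to display); a
# camera-sensor gesture sends the message preconfigured for that gesture.

FAVORITES = [("Alice", "blue"), ("Bob", "green"), ("Carol", "red")]
MESSAGES = {"close": "Call you back soon", "double_tap": "On my way"}

class FavoriteContactEdge:
    def __init__(self, favorites=FAVORITES):
        self.favorites = favorites
        self.index = -1          # nothing selected until the first tap

    def on_hrm_tap(self):
        """Advance to the next favorite; return the color to display."""
        self.index = (self.index + 1) % len(self.favorites)
        return self.favorites[self.index][1]

    def on_camera_gesture(self, gesture):
        """Send the message preconfigured for this camera-sensor gesture.

        Returns (contact, message), or None when no contact is selected.
        """
        if self.index < 0:
            return None
        contact = self.favorites[self.index][0]
        return (contact, MESSAGES.get(gesture))
```

The modulo wrap-around means repeated taps cycle through the whole list and back to the first contact, matching the description of successive taps switching the displayed color.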
[0036] FIG. 7 illustrates a computing environment implementing the method for user interaction with the electronic device through a sensor unit, as disclosed in the embodiments herein. As depicted, the computing environment 702 comprises at least one processing unit 704 that is equipped with a control unit 706 and an Arithmetic Logic Unit (ALU) 708, a memory 710, a storage unit 712, a plurality of networking devices 714 and a plurality of input/output (I/O) devices 716. The processing unit 704 is responsible for processing the instructions of the algorithm. The processing unit 704 receives commands from the control unit 706 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 708.
[0037] The overall computing environment 702 can be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special media and other accelerators. Further, the plurality of processing units 704 may be located on a single chip or over multiple chips.
[0038] The algorithm, comprising the instructions and code required for the implementation, is stored in either the memory unit 710 or the storage 712 or both. At the time of execution, the instructions may be fetched from the corresponding memory 710 and/or storage 712 and executed by the processing unit 704. In the case of a hardware implementation, various networking devices 714 or external I/O devices 716 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit. The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 through FIG. 7 include blocks which can be at least one of a hardware device, or a combination of a hardware device and a software module.
[0039] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Documents

Application Documents

# Name Date
1 Form 5 [26-02-2016(online)].pdf 2016-02-26
2 Form 3 [26-02-2016(online)].pdf 2016-02-26
3 Form 18 [26-02-2016(online)].pdf 2016-02-26
4 Drawing [26-02-2016(online)].pdf 2016-02-26
5 Description(Complete) [26-02-2016(online)].pdf 2016-02-26
6 abstract 201641006838.jpg 2016-06-15
7 201641006838-OTHERS-Grant Of Patent-030616.pdf 2016-07-21
8 201641006838-Power of Attorney-210616.pdf 2016-07-26
9 201641006838-Form 5-210616.pdf 2016-07-26
10 201641006838-Correspondence-F5-PA-210616.pdf 2016-07-26
11 201641006838-FORM-26 [16-03-2018(online)].pdf 2018-03-16
12 201641006838-FORM-26 [16-03-2018(online)]_88.pdf 2018-03-16
13 201641006838-FER.pdf 2020-02-18
14 201641006838-ABSTRACT [14-08-2020(online)].pdf 2020-08-14
15 201641006838-CLAIMS [14-08-2020(online)].pdf 2020-08-14
16 201641006838-CORRESPONDENCE [14-08-2020(online)].pdf 2020-08-14
17 201641006838-DRAWING [14-08-2020(online)].pdf 2020-08-14
18 201641006838-FER_SER_REPLY [14-08-2020(online)].pdf 2020-08-14
19 201641006838-OTHERS [14-08-2020(online)].pdf 2020-08-14
20 201641006838-PatentCertificate24-08-2023.pdf 2023-08-24
21 201641006838-IntimationOfGrant24-08-2023.pdf 2023-08-24

Search Strategy

1 search_31-01-2020.pdf
