
Method Of Performing A Touch Action In A Touch Sensitive Device

Abstract: The present invention discloses a method of performing a touch action in a touch sensitive device. The method comprises detecting a shape of a contact area associated with a touch input provided on a touch screen, determining whether the identified shape is valid based on a pre-determined criteria, detecting a gesture based on a movement of the validated shape, determining whether the detected gesture is valid by matching the gesture with one or more pre-defined gestures, and performing the touch action. Likewise, if the touch input is a hand shape, the shape of the contact area associated with the touch input provided on the touch screen is identified based on one or more pre-defined parameters. Further, an orientation of the identified shape is determined. Then the identified shape is validated based on pre-determined criteria. If the identified shape is valid, the touch action is performed on the touch sensitive device. FIGURE 12


Patent Information

Application #: 3353/CHE/2014
Filing Date: 07 July 2014
Publication Number: 08/2016
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status: Granted
Email: bangalore@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-08-31
Renewal Date:

Applicants

SAMSUNG R&D INSTITUTE INDIA – BANGALORE Pvt. Ltd.
# 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India (an Indian company)

Inventors

1. DEOTALE, Gunjan P
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India
2. THAJUDEEN, Niyas Ahmed Sulthar
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India
3. VAISH, Rahul
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India
4. BHAMIDIPATI, Sreevatsa Dwaraka
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India
5. POKHREL, Niroj
Employed at Samsung R&D Institute India – Bangalore Private Limited, having its office at # 2870, ORION Building, Bagmane Constellation Business Park, Outer Ring Road, Doddanakundi Circle, Marathahalli Post, Bangalore - 560037, Karnataka, India
6. KIM, Changjin
Samsung Electronics Co., Ltd., System Software R&D Team, 416, Maetan-3Dong, Yeongtong-Gu, Suwon City, Gyeonggi-Do 443-742, Korea
7. KIM, Byeongjae
Samsung Electronics Co., Ltd., System Software R&D Team, 416, Maetan-3Dong, Yeongtong-Gu, Suwon City, Gyeonggi-Do 443-742, Korea
8. KWON, Jungtae
Samsung Electronics Co., Ltd., System Software R&D Team, 416, Maetan-3Dong, Yeongtong-Gu, Suwon City, Gyeonggi-Do 443-742, Korea
9. KIM, Namyun
Samsung Electronics Co., Ltd., System Software R&D Team, 416, Maetan-3Dong, Yeongtong-Gu, Suwon City, Gyeonggi-Do 443-742, Korea

Specification

DESC:

FORM 2
THE PATENTS ACT, 1970
[39 of 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(Section 10; Rule 13)

METHOD OF PERFORMING A TOUCH ACTION IN A TOUCH SENSITIVE DEVICE

SAMSUNG R&D INSTITUTE INDIA – BANGALORE PRIVATE LIMITED
# 2870, ORION Building,
Bagmane Constellation Business Park,
Outer Ring Road, Doddanakundi Circle,
Marathahalli Post, Bangalore-560 037
an Indian Company

The following specification particularly describes the invention and the manner in which it is to be performed

RELATED APPLICATION

Benefit is claimed to Indian Provisional Application No. 3353/CHE/2014 titled “A METHOD OF IDENTIFYING THE HAND SHAPES AND GESTURES IN TOUCH AND HOVER SENSITIVE DEVICES” filed on 07 July 2014, which is herein incorporated in its entirety by reference for all purposes.

FIELD OF THE INVENTION

The present invention generally relates to the field of touch screen and more particularly relates to method and system for identifying a gesture to enable one or more device specific actions in a touch screen enabled device.

BACKGROUND OF THE INVENTION

The touch screens of different types of electronic devices detect the coordinates of the point of touch and perform a pre-defined action. With a touch screen, the movement of the input pointer on a display can correspond to the relative movements of the user's finger as the finger is moved along the surface of the touch screen. Likewise, hand gestures have been implemented in touch screens; for example, selections can be made when one or more taps are detected on the surface of the touch screen. In some cases, any portion of the touch screen can be tapped, and in other cases a dedicated portion of the touch screen can be tapped. In addition to selections, scrolling can be initiated by using finger motion at the edge of the touch screen.
In recent times, more advanced gestures have been implemented. For example, scrolling can be initiated by placing four fingers on the touch screen so that the scrolling gesture is recognized, and thereafter moving these fingers on the touch screen to perform scrolling events. The methods for implementing these advanced gestures, however, can be limited and in many instances counterintuitive. In certain applications, it can be beneficial to enable a user to use “real-world” gestures such as hand movements and/or finger orientations that can be generally recognized to mean certain things to more efficiently and accurately effect intended operations.
However, existing approaches require pre-learning or training a model for different types of inputs in order to enable a specific gesture, which consumes the storage of the device. Hence, there exists a need for a method and system for identifying a gesture to enable one or more device-specific actions in a touch screen enabled device using the shape and orientation of the touch input.

SUMMARY

An objective of the present invention is to provide a method of identifying a gesture to enable device-specific actions in touch screen enabled devices using one or more parameters. The gesture can be a mere hand shape or a touch pattern.

The method of performing a touch action in a touch sensitive device using hand shapes comprises the following steps. The shape of a contact area associated with a touch input provided on a touch screen of the touch sensitive device is identified based on one or more pre-defined parameters. Further, an orientation of the identified shape is determined. Then the identified shape is validated based on pre-determined criteria. If the identified shape is valid, then the touch action is performed on the touch sensitive device.

Another aspect of the present invention discloses a method of performing a touch action in a touch-sensitive device using gestures. The method comprises detecting a shape of a contact area associated with a touch input provided on a touch screen of the touch-sensitive device, determining whether the detected shape is valid based on a pre-determined criteria, detecting a gesture based on a movement of the validated shape, determining whether the detected gesture is valid by matching the gesture with one or more pre-defined gestures, and performing the touch action based on determining that the detected gesture is valid and that the detected shape is valid.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS

The aforementioned aspects and other features of the present invention will be explained in the following description, taken in conjunction with the accompanying drawings, wherein:

Figure 1 is a flow diagram illustrating a method of recognizing a hand shape for performing a touch action in a touch sensitive device, according to one embodiment of present invention.

Figure 2 is a flow chart illustrating a method of processing mutual capacitance data and computing parameters, according to one embodiment of present invention.

Figure 3 is a flow chart illustrating a method of data binarization, according to one embodiment of present invention.

Figure 4A is a flow chart illustrating a method of region identification, according to one embodiment of present invention.

Figure 4B is a flow chart illustrating a method of modifying wrongly interpreted region values in the process of region identification, according to one embodiment of present invention.

Figure 5 is a schematic representation of determining orientation of shape using average angle method, according to one embodiment of present invention.

Figure 6 is a flow chart illustrating a method of identifying various parts of the hand and separating joined fingers, according to one embodiment of present invention.

Figure 7A is a flow chart illustrating a method of identifying a fist, according to one embodiment of present invention.

Figure 7B illustrates an exemplary shape of a fist along with parameters such as height, width, left to right length and right to left length, according to one embodiment of present invention.

Figure 8A is a flow chart illustrating a method of identifying a finger, according to one embodiment of present invention.

Figure 8B indicates parameters such as height, width, left to right diagonal length and right to left diagonal length corresponding to the shape of a finger, according to one embodiment of present invention.

Figure 9A is a flow chart illustrating a method of separating a finger from the palm, according to one embodiment of present invention.

Figure 9B is a schematic representation of the steps of separating a finger from the palm, according to one embodiment of present invention.

Figure 10 is a flow chart illustrating a method of storing shapes, according to one embodiment of present invention.

Figure 11 is a flow chart illustrating a method of shape matching using calculated parameters and recorded parameters, according to one embodiment of present invention.

Figure 12 is a flow chart illustrating a method of performing a touch action in a touch-sensitive device, according to another embodiment of present invention.

Figure 13 is a flow diagram illustrating a method of computing parameters and linearizing the gesture performed by the user, according to one embodiment of present invention.

Figures 14A and 14B are schematic representations of matching a sample gesture containing two edges, according to one embodiment of present invention.

Figure 15A is a flow diagram illustrating a method of computing one or more parameters for matching, according to another embodiment of present invention.

Figure 15B is a flow diagram illustrating a method of processing the one or more coordinates collected in the TSP IC, according to another embodiment of present invention.

Figure 15C is a flow diagram illustrating a method of matching the gesture performed by the user with the registered gesture, according to one embodiment of present invention.

Figure 16A is a flow diagram illustrating a method of computing pressure vertex and direction vertex for determining the gesture performed by the user, according to one embodiment of present invention.

Figure 16B is a flow diagram illustrating a method of matching the gesture performed by the user with the recorded gesture, according to one embodiment of present invention.

Figure 17 is a flow diagram illustrating a method of registering one or more gestures with the touch sensitive device, according to one embodiment of present invention.

DETAILED DESCRIPTION OF THE INVENTION

The embodiments of the present invention will now be described in detail with reference to the accompanying drawings. However, the present invention is not limited to these embodiments and can be modified in various forms. Thus, the embodiments are only provided to explain the present invention more clearly to those ordinarily skilled in the art. In the accompanying drawings, like reference numerals are used to indicate like components.

The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.

As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include operatively connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.

Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

Figure 1 is a flow diagram illustrating a method of recognizing a hand shape for performing a touch action in a touch sensitive device, according to one embodiment of present invention. When a user touches the screen, the mutual capacitance changes, and the change in the mutual capacitance is detected by the touch sensor panel integrated circuit (TSP IC) (i.e. the touch controller) at step 102. The shape of the touch area is considered as a region. The detected mutual capacitance data for the region is processed at step 103 and a set of parameters is calculated. Based on the calculated parameters, the shape is identified at step 104. The identified shape is matched with the pre-defined or pre-recorded shapes in the touch sensitive device at step 105. According to one embodiment of present invention, the shape pattern can be a combination of multiple shapes. Further, at step 106, a touch action corresponding to the identified shape is enabled in the touch sensitive device.

The set of parameters according to one embodiment of present invention comprises one or more of an area of a region, a width of a region, a height of a region, a right-left slant length of a region, a left-right slant length of a region, a number of touch nodes enclosed in a region, a hypotenuse of a region, a rectangularity of a region, an elongatedness of a region and an average angle of a region.
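Purely for illustration, these parameters can be pictured as one record per detected region. The following Python sketch is not part of the specification; every field name is an assumption chosen to mirror the list above:

```python
from dataclasses import dataclass

@dataclass
class RegionParameters:
    """Hypothetical per-region parameter record mirroring the list above.
    Values are expressed in touch nodes unless noted otherwise."""
    area: int                # number of touch nodes inside the region
    width: int               # width of the bounding box
    height: int              # height of the bounding box
    rl_slant_length: int     # right-left slant length of the region
    lr_slant_length: int     # left-right slant length of the region
    node_count: int          # number of touch nodes enclosed in the region
    hypotenuse: float        # hypotenuse of the bounding box
    rectangularity: float    # region area divided by bounding-box area
    elongatedness: float     # how stretched the region is
    average_angle: float     # orientation from the average angle method, degrees
```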

In one embodiment of present invention, the region is identified based on detecting a change in the mutual capacitance in the touch screen. The mutual capacitance data is processed and the one or more parameters are defined based on the identified one or more touch regions. The method of processing the mutual capacitance data is explained in detail in Figure 2. Once the parameters are identified, the value of each of the defined parameters is determined for the identified one or more touch regions. The shape of the region is identified based on the determined values of the defined parameters.

Figure 2 is a flow chart illustrating a method of processing mutual capacitance data and computing parameters, according to one embodiment of present invention. In order to identify the shape of the contact area, the mutual capacitance data is processed. The processing includes filtering of the mutual capacitance data using a binarization method, as indicated in step 201. The method of data binarization is explained in detail in Figure 3. Further, the one or more touch regions on the touch screen are identified using a region identification method at step 202. Further, the parameters are computed for the identified region.

Figure 3 is a flow chart illustrating a method of data binarization, according to one embodiment of present invention. According to one embodiment of present invention, the coupling capacitance of two crossing conductors/electrodes of the touch sensor panel is considered as a node. The change in mutual capacitance at each node is calculated. A pre-defined threshold value is selected for performing data binarization. Further, as indicated in step 301, the change in mutual capacitance at the current node is compared with the pre-defined threshold value of the mutual capacitance. If the mutual capacitance value of the current node is greater than the pre-defined threshold, the value of the current node is set to 1 (at step 303). If the mutual capacitance value of the current node is less than the pre-defined threshold, the value of the current node is set to 0 (at step 302).
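As a minimal sketch of this thresholding, assuming the frame of mutual-capacitance changes arrives as a numpy array (the function name and array representation are illustrative assumptions, not the TSP IC firmware):

```python
import numpy as np

def binarize(frame: np.ndarray, threshold: float) -> np.ndarray:
    """Data binarization of Figure 3: 1 where the change in mutual
    capacitance exceeds the pre-defined threshold (step 303), else 0
    (step 302)."""
    return (frame > threshold).astype(np.uint8)

# Usage: a small frame of capacitance changes with a threshold of 10.
frame = np.array([[ 2, 12, 14,  3],
                  [ 1, 15, 16,  2],
                  [ 0,  3,  2,  1]])
print(binarize(frame, threshold=10))
```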

Figure 4A is a flow chart illustrating a method of region identification, according to one embodiment of present invention. In the present invention, the identification of the touch region is performed using the mutual capacitance touch data array. The mutual capacitance touch data array is formed using the binarised data. In order to identify the region of touch, the following initial conditions are given at step 401.

Region value = 2
Connected region value = 0

At step 402, it is determined whether all the nodes in the mutual capacitance touch data array have been checked; the nodes are also referred to as points. If all the nodes in the mutual capacitance touch data array have been checked, then the process of identifying the region terminates, as shown in step 403, and the control flow is transferred to the flow diagram in Figure 4B. If all the nodes in the mutual capacitance touch data array have not been checked, then at step 404, it is determined whether the current value of the node is 1. If the current value of mutual capacitance of the node is 1, then it is determined whether at least one value around the current node in all eight directions is greater than one. If there exist one or more such nodes having a mutual capacitance value greater than one, then the connected region value is updated with the greatest node value obtained from the nodes surrounding the current node (step 406). Likewise, at step 407, the current node value and all the nodes surrounding the current node are updated with the updated connected region value. If there are no surrounding nodes having a mutual capacitance value greater than 1, then the connected region value is updated with the initial region value, as indicated in step 408. Further, the region value is incremented and the control flow is directed towards step 407. Upon completion of updating the connected region value, the next node in the mutual capacitance data array is selected at step 410.

Figure 4B is a flow chart illustrating a method of modifying wrongly interpreted region values in the process of region identification, according to one embodiment of present invention. In order to rectify the wrong interpretation of region values, the initial value of the current region is set as 2 (step 411). Further, it is determined whether the current region value is greater than the final region value at step 412. The final region value is the final value of the connected region value of step 408 in Figure 4A. If the current region value is greater than the final region value, then the process is terminated, as indicated in step 413. If the current region value is less than the final region value, then the control flow is transferred to the beginning of the mutual capacitance data array at step 414. Likewise, at step 415, it is determined whether all nodes in the mutual capacitance data array have been checked. If all nodes in the mutual capacitance data array have been checked, then at step 416, the current region value is incremented. If all nodes in the mutual capacitance data array have not been checked, then it is determined whether the value of the mutual capacitance at the current node is equal to the current region value at step 418. If the value of the mutual capacitance at the current node is equal to the current region value, it is determined whether any value around the current node in the surrounding 8 directions is greater than zero and not equal to the current region value. If yes, then the error region value is set to the value obtained in the previous processing (step 420). Consequently, at step 421, all nodes in the mutual capacitance data array that are equal to the error region value are set to the current region value.
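Taken together, Figures 4A and 4B amount to grouping 8-connected touched nodes into numbered regions. The sketch below reaches the same grouping with a breadth-first flood fill rather than the specification's scan-and-correct bookkeeping, so it is an illustrative equivalent, not the claimed procedure; only the starting region value of 2 (step 401) is kept:

```python
from collections import deque
import numpy as np

def label_regions(binary: np.ndarray) -> np.ndarray:
    """Assign each 8-connected group of 1-nodes a distinct region value,
    starting from 2 as in step 401. Untouched nodes stay 0."""
    labels = binary.astype(np.int32)
    rows, cols = labels.shape
    next_label = 2
    for r in range(rows):
        for c in range(cols):
            if labels[r, c] == 1:                   # touched, not yet labelled
                labels[r, c] = next_label
                queue = deque([(r, c)])
                while queue:
                    y, x = queue.popleft()
                    for dy in (-1, 0, 1):           # all eight directions
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if 0 <= ny < rows and 0 <= nx < cols \
                                    and labels[ny, nx] == 1:
                                labels[ny, nx] = next_label
                                queue.append((ny, nx))
                next_label += 1
    return labels
```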

Figure 5 illustrates a schematic representation of the calculation of parameters for shape recognition, according to one embodiment of present invention. The various parameters to be calculated in order to determine the shape of the touch region are one or more of an area of a region, a width of a region, a height of a region, a right-left slant length of a region, a left-right slant length of a region, a number of touch nodes enclosed in a region, a hypotenuse of a region, a rectangularity of a region, an elongatedness of a region and an average angle of a region.

In order to find the minimum bounding rectangle, a method for calculating the bounding box is used.

Area of bounding box = Width × Height of the bounding box.
Perimeter of bounding box = 2 × (Width + Height) of the bounding box.

Centroid of shape (x, y) = (Σ xi / n, Σ yi / n), where i ∈ [1, n] and n is the total number of points detected.
xi: x coordinates of points inside the shape.
yi: y coordinates of points inside the shape.
Area of shape = total number of points touched inside the shape.
Perimeter of shape = total number of points covered on the border of the shape.

In order to identify the orientation of the shape, the average angle method is used. According to one embodiment of present invention, straight lines are drawn inside the shape at different angles; each line whose length is above a threshold is retained, and the average of the angles of all retained lines is computed.

The average angle method (the angle of line):

Average angle of shape = Σ ai / n, where ai is the angle of line i, i ∈ [1, n], and n is the total number of lines retained.
The steps involved in determining the left to right width:
Step 1: Find the point of the shape closest to the left-top point of the bounding box.
Step 2: Find the point of the shape closest to the right-bottom point of the bounding box.
Step 3: Count all the points lying on the line joining the points mentioned in Steps 1 and 2.
The steps involved in determining the right to left width:
Step 1: Find the point of the shape closest to the right-top point of the bounding box.
Step 2: Find the point of the shape closest to the left-bottom point of the bounding box.
Step 3: Count all the points lying on the line joining the points mentioned in Steps 1 and 2.
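The bounding-box and centroid formulas above translate directly into code. The following sketch assumes a labelled array such as one produced by a region-identification step; the function name and dictionary keys are illustrative assumptions:

```python
import numpy as np

def shape_parameters(labels: np.ndarray, region: int) -> dict:
    """Bounding box, centroid, area and rectangularity for one labelled
    region, following the formulas above."""
    ys, xs = np.nonzero(labels == region)   # points inside the shape
    n = len(xs)                             # total number of points detected
    width = int(xs.max() - xs.min()) + 1
    height = int(ys.max() - ys.min()) + 1
    return {
        "bbox_area": width * height,                 # Width x Height
        "bbox_perimeter": 2 * (width + height),      # 2 x (Width + Height)
        "centroid": (xs.sum() / n, ys.sum() / n),    # (sum xi / n, sum yi / n)
        "shape_area": n,                             # points inside the shape
        "rectangularity": n / (width * height),
    }
```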

Figure 6 is a flow chart illustrating a method of identifying various parts of the hand and separating joined fingers, according to one embodiment of present invention. In one embodiment of present invention, fist identification is performed at step 601. A method of identifying a fist is explained in detail in Figure 7A. Further, at step 602 it is determined whether a fist is present in the touch region. If a fist is present in the touch region, then the fist count is incremented and all points in the identified region are reset to 0. If a fist is not identified in the touch region, then finger identification is performed at step 604. The method of identifying a finger is explained in detail in Figure 8A. If a finger is present in the touch region, then the finger count is incremented and all the points present in the touch region are reset to 0 (step 606). If a finger is not present in the touch region, then it is determined whether any undefined region is present in the touch region. If no undefined region is present in the touch region, then the process terminates (step 608). If at least one undefined region is detected, then it is determined whether the current unidentified shape is different from the pre-defined shapes at step 609. If the current unidentified shape does not match the pre-recorded shapes, then the palm count is incremented and the process terminates, as indicated in step 610. If the current unidentified shape matches the pre-recorded shapes, then at step 611, finger identification is performed.

Figure 7A is a flow chart illustrating a method of identifying a fist, according to one embodiment of present invention. In order to identify a fist, parameters such as the height to width ratio, left to right length, and perimeter of the bounding box are used. A pre-defined range is set for each of the parameters. According to one exemplary embodiment of present invention, at step 701, the height to width ratio is calculated. In order to identify the shape of a fist, the height to width ratio must fall within a constant range. In one embodiment of present invention, the ideal height to width ratio is approximately 3. If the height to width ratio falls within the pre-defined range, then the left to right length and right to left length are calculated at step 702. Further, it is determined whether the left to right length and right to left length fall within a pre-defined range (approximately 1.3 if LR > RL, or 0.75 if RL > LR) at step 702. Then the perimeter of the shape or the perimeter of the bounding box is determined. If the perimeter of the shape or the perimeter of the bounding box falls within the constant range fixed for a fist (approximately 0.75), then the area of the bounding box and the area covered by the hand shape are computed at step 703. At step 704, the ratio of the area of the bounding box to the area covered by the hand shape is computed. If the computed ratio falls within a pre-defined range, then the fist is identified at step 705. The fist is not identified if any of the above calculated parameters does not fall within the pre-defined range. The pre-defined ranges are set based on a study conducted on the hands of various people to identify possible measures of a fist.
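Expressed as code, the gating of Figure 7A might read as below. The exact bounds are not fully specified in the text, so the numeric ranges here are illustrative stand-ins around the approximate values quoted above (ideal ratio ~3, the 1.3/0.75 slant figures, the ~0.75 perimeter figure), not the recorded study values:

```python
def is_fist(height: float, width: float, lr_len: float, rl_len: float,
            perim_ratio: float, shape_area: float, bbox_area: float) -> bool:
    """Sequential range checks of Figure 7A (steps 701-705); all bounds
    are assumptions for illustration."""
    if not 2.5 <= height / width <= 3.5:             # step 701: ideal ratio ~3
        return False
    slant = lr_len / rl_len
    if not (slant >= 1.3 or slant <= 0.75):          # step 702: LR/RL figures
        return False
    if abs(perim_ratio - 0.75) > 0.15:               # step 703: perimeter ~0.75
        return False
    if not 0.6 <= shape_area / bbox_area <= 0.9:     # step 704: fill ratio
        return False
    return True                                      # step 705: fist identified
```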

Figure 7B illustrates an exemplary shape of a fist along with a few parameters such as height, width, left to right length and right to left length.

Figure 8A is a flow chart illustrating a method of identifying a finger, according to one embodiment of present invention. In one exemplary embodiment of present invention, the right to left diagonal length and left to right diagonal length of the hand shape are computed to identify the finger. At step 801, it is determined whether the maximum of the left to right diagonal length and right to left diagonal length falls within a pre-defined range fixed for a finger. In an exemplary embodiment, the pre-defined range is 2.5. If the maximum of the left to right diagonal length and right to left diagonal length falls within the pre-defined range, at step 802 the perimeter of the shape or the perimeter of the bounding box is computed, and it is determined whether the perimeter of the shape or the perimeter of the bounding box lies within the range fixed for a finger at step 803. In an exemplary embodiment of present invention, this range is set as approximately 1. Further, at step 804, the ratio between the area of the bounding box and the area covered by the finger region is calculated. If the ratio falls within the pre-defined range, then the finger is identified at step 805. In one embodiment of present invention, the pre-defined range of the ratio between the area of the bounding box and the area covered by the finger region is set approximately as 0.7 to 0.9. In case any of the above-mentioned ratios of parameters does not fall within the corresponding pre-defined range, the process terminates without identifying the finger, as indicated in step 806.
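A corresponding sketch for Figure 8A follows. The text gives 2.5 for the diagonal check and 0.7 to 0.9 for the fill ratio but does not spell out the normalisations, so reading the 2.5 as the larger diagonal over the smaller one, and the ~1 perimeter figure as a ratio, are assumptions made for illustration:

```python
def is_finger(lr_diag: float, rl_diag: float, perim_ratio: float,
              shape_area: float, bbox_area: float) -> bool:
    """Range checks of Figure 8A (steps 801-806); normalisations are
    assumptions for illustration."""
    if max(lr_diag, rl_diag) / min(lr_diag, rl_diag) > 2.5:  # step 801
        return False
    if abs(perim_ratio - 1.0) > 0.2:                 # step 803: range ~1
        return False
    if not 0.7 <= shape_area / bbox_area <= 0.9:     # steps 804-805
        return False
    return True
```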

Figure 8B indicates various parameters such as height, width, left to right diagonal length and right to left diagonal length corresponding to the shape of a finger.

Figure 9A illustrates a flow chart of a method of separating/differentiating a finger from the palm, according to one embodiment of present invention. In one embodiment of present invention, at step 901 the bounding box for the touch region is obtained. At step 902, the first encountered region from the top left is obtained. The first encountered region, in view of the present embodiment, may be the leftmost finger. Further, all points of the encountered region are traversed along the width of the bounding box and the resulting length is stored at step 903. Then, at step 904, the next row of the bounding box is selected. Further, the length of the previously encountered region in the current row is counted at step 905. Then the absolute value of the difference between the previously stored length and the length of the current region is calculated; at step 906, this difference is referred to as delta. At step 907, it is determined whether the delta value is greater than a pre-defined threshold value. In one embodiment of present invention, the pre-defined threshold is set as 4. In parallel, the length of the encountered region in the current row is set to zero. If the value of delta is less than the pre-defined threshold, then the control flow transfers to step 904, for selecting the next row of the bounding box. If the value of delta is higher than the pre-defined threshold value, then at step 908, all values in the current row are reset to zero and the process terminates.
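The row-by-row width comparison of Figure 9A can be sketched as follows, assuming the region arrives as a binary mask; the mask representation, and clearing only the boundary row per step 908, are illustrative readings of the text:

```python
import numpy as np

def separate_finger_from_palm(region: np.ndarray, threshold: int = 4) -> np.ndarray:
    """Scan rows top-down (steps 903-905); where the region width jumps by
    more than the pre-defined threshold (delta, steps 906-907, default 4 as
    in the text), clear that row to split the finger from the palm (step 908)."""
    out = region.copy()
    prev_len = None
    for row in range(out.shape[0]):
        length = int(out[row].sum())        # width of the region in this row
        if length == 0:
            continue
        if prev_len is not None and abs(length - prev_len) > threshold:
            out[row, :] = 0                 # step 908: cut at the boundary row
            break
        prev_len = length
    return out
```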

Figure 9B is a schematic representation of the various steps involved in separating a finger from the palm, according to one embodiment of present invention. At step 1, the finger is separated from the palm. Likewise, a bounding box is created for the rest of the touch region at steps 2 and 3, and each of the fingers and the palm is identified at step 4.

Figure 10 is a flow chart illustrating a method of storing shapes, according to one embodiment of present invention. In order to record one or more shapes in the touch sensitive device for reference, the user needs to select the option to record a shape. According to one embodiment of present invention, a blank area is provided in the touch sensitive device for providing an input for recording a gesture using the hand shape at step 1001. The user needs to repeat the same gesture multiple times for recording the minor deviations present in the hand shape while providing the touch input. Each of the touch input frames is stored in the touch sensitive device memory. If the data for a predefined number of consecutive frames is similar (step 1002), then at step 1003, the mutual capacitance data is processed. Further, the parameters are computed for the shape. Once the hand shape is validated and the parameters are computed, the parameters and corresponding shapes are stored in the touch sensitive device.
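The stability condition of step 1002 might be checked along these lines; treating each frame's computed parameters as a dict and requiring exact agreement over a fixed number of consecutive frames are both assumptions made for this sketch:

```python
from typing import Optional

def accept_shape_for_recording(frames: list, stable_frames: int = 3) -> Optional[dict]:
    """Return the shape's parameters once a predefined number of consecutive
    frames agree (step 1002); otherwise None. 'frames' holds one parameter
    dict per stored touch frame (step 1001)."""
    if len(frames) < stable_frames:
        return None
    recent = frames[-stable_frames:]
    if all(f == recent[0] for f in recent):
        return recent[0]      # step 1003: process and store these parameters
    return None
```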

Figure 11 is a flow chart illustrating a method of shape matching using calculated parameters and recorded parameters, according to one embodiment of present invention. In order to perform the shape matching, at step 1101, it is determined whether the number of regions detected is equal to the number of recorded regions. If the number of regions detected is equal to the number of recorded regions, the following absolute values are computed and compared with a pre-defined ratio threshold:
ABS(W/LR - RECORDED(W/LR))
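The comparison of Figure 11 can be pictured as below; the dictionary keys, the tolerance value, and extending the single W/LR term shown above to a loop over several recorded ratios are all assumptions for illustration:

```python
def shapes_match(calculated: dict, recorded: dict, tolerance: float = 0.1) -> bool:
    """Sketch of Figure 11: region counts must match (step 1101), then each
    calculated ratio, e.g. W/LR, must stay within a pre-defined tolerance of
    its recorded counterpart, i.e. ABS(W/LR - RECORDED(W/LR)) <= tolerance."""
    if calculated["region_count"] != recorded["region_count"]:   # step 1101
        return False
    for key in ("w_over_lr",):            # hypothetical parameter key(s)
        if abs(calculated[key] - recorded[key]) > tolerance:
            return False
    return True
```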

Documents

Application Documents

# Name Date
1 SRIB-20140513-006_PS_filied with IPO on 7 June 2014.pdf 2014-07-11
2 SRIB-20140513-006_drawings_PS_filed on 7 July 2014.pdf 2014-07-11
3 SRIB-20140513-006_drawings filed with IPO on 21 April 2015.pdf 2015-04-21
4 SRIB-20140513-006_complete specification filed with IPO on 21 April 2015.pdf 2015-04-21
5 3353-CHE-2014 FORM-13 26-05-2015.pdf 2015-05-26
6 SRIB-20140513-006_Form 13_filed with IPO on 26 May 2015.pdf 2015-06-04
7 SRIB-20140513-006_drawings _filed with IPO on 26 May 2015.pdf 2015-06-04
8 SRIB-20140513-006_complete specification_filed with IPO on 26 May 2015_markedup copy.pdf 2015-06-04
9 SRIB-20140513-006_complete specification_filed with IPO on 26 May 2015_clean Copy.pdf 2015-06-04
10 SRIB-20140513-006_Form 13_filed with IPO on 26 May 2015.pdf_684.pdf 2015-06-24
11 SRIB-20140513-006_drawings _filed with IPO on 26 May 2015.pdf_678.pdf 2015-06-24
12 SRIB-20140513-006_complete specification_filed with IPO on 26 May 2015_markedup copy.pdf_680.pdf 2015-06-24
13 SRIB-20140513-006_complete specification_filed with IPO on 26 May 2015_clean Copy.pdf_679.pdf 2015-06-24
14 Request for Certified Copy of 3353CHE2014_CS.pdf 2015-06-24
15 abstract 3353-CHE-2014.jpg 2015-09-11
16 3353-CHE-2014-FORM-26 [03-08-2019(online)].pdf 2019-08-03
17 3353-CHE-2014-FORM 13 [05-08-2019(online)].pdf 2019-08-05
18 3353-CHE-2014-FER.pdf 2019-11-27
19 3353-CHE-2014-CLAIMS [26-05-2020(online)].pdf 2020-05-26
20 3353-CHE-2014-COMPLETE SPECIFICATION [26-05-2020(online)].pdf 2020-05-26
21 3353-CHE-2014-DRAWING [26-05-2020(online)].pdf 2020-05-26
22 3353-CHE-2014-FER_SER_REPLY [26-05-2020(online)].pdf 2020-05-26
23 3353-CHE-2014-OTHERS [26-05-2020(online)].pdf 2020-05-26
24 3353-CHE-2014-Annexure [27-05-2020(online)].pdf 2020-05-27
25 3353-CHE-2014-Proof of Right [26-08-2021(online)].pdf 2021-08-26
26 3353-CHE-2014-PETITION UNDER RULE 138 [26-08-2021(online)].pdf 2021-08-26
27 3353-CHE-2014-PETITION UNDER RULE 137 [27-08-2021(online)].pdf 2021-08-27
28 3353-CHE-2014-IntimationOfGrant31-08-2021.pdf 2021-08-31
29 3353-CHE-2014-PatentCertificate31-08-2021.pdf 2021-08-31

Search Strategy

1 3353CHE2014_27-11-2019.pdf

ERegister / Renewals

3rd: 29 Nov 2021 (From 07/07/2016 To 07/07/2017)
4th: 29 Nov 2021 (From 07/07/2017 To 07/07/2018)
5th: 29 Nov 2021 (From 07/07/2018 To 07/07/2019)
6th: 29 Nov 2021 (From 07/07/2019 To 07/07/2020)
7th: 29 Nov 2021 (From 07/07/2020 To 07/07/2021)
8th: 29 Nov 2021 (From 07/07/2021 To 07/07/2022)
9th: 14 Jun 2022 (From 07/07/2022 To 07/07/2023)