
Method And System For Automating And Dynamically Testing Gesture Recognition Application

Abstract: The invention provides a method for dynamically testing a gesture-enabled graphical user interface (GUI) application. The method includes creating gesture template(s) indicating gesture characteristic(s), receiving a test script including the gesture characteristic, and retrieving the gesture template corresponding to the test script. Further, the method includes translating the retrieved gesture template into a gesture, triggering the gesture-enabled GUI application to respond in accordance with the gesture, and generating a test report based on the response in order to validate the gesture-enabled GUI application. FIG. 3


Patent Information

Application #: 2447/CHE/2013
Filing Date: 04 June 2013
Publication Number: 26/2013
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Email: patent@brainleague.com

Applicants

HCL Technologies Limited
50-53 Greams Road, Chennai – 600006, Tamil Nadu, India

Inventors

1. Rajesh Babu Suraparaju
HCL Technologies Ltd., Arihant Technopolis, 4/293 Old Mahabalipuram Road, Kandanchavadi, Chennai 600096
2. Krishna Bharat Yedla
HCL Technologies Ltd., Arihant Technopolis, 4/293 Old Mahabalipuram Road, Kandanchavadi, Chennai 600096
3. Bala Arvind Ganesan
HCL Technologies Ltd., Arihant Technopolis, 4/293 Old Mahabalipuram Road, Kandanchavadi, Chennai 600096

Specification

CLAIMS
What is claimed is:
1. A method for dynamically testing a gesture-enabled graphical user interface (GUI) application, the method comprising:
creating at least one gesture template indicating at least one gesture characteristic;
receiving at least one test script including said at least one gesture characteristic;
retrieving said at least one gesture template corresponding to said test script;
translating said at least one retrieved gesture template into at least one gesture;
triggering said gesture-enabled GUI application to respond in accordance with said at least one gesture; and
generating at least one test report based on said response to validate said gesture-enabled GUI application.
2. The method of claim 1, wherein said at least one gesture comprises at least one of deictic gesture, manipulative gesture, semaphore gesture, gesticulation gesture, language gesture, and multiple gesture style.
3. The method of claim 1, wherein said method further comprises storing said at least one gesture template.
4. The method of claim 1, wherein said method further comprises creating said at least one test script.
5. The method of claim 1, wherein said at least one test script defines said at least one gesture characteristic corresponding to said at least one gesture template.
6. The method of claim 1, wherein said method further comprises executing said at least one test script.
7. The method of claim 1, wherein said method further comprises parsing said at least one gesture characteristic to identify said at least one gesture.
8. The method of claim 1, wherein said method further comprises mapping said at least one test script with said at least one gesture template.
9. The method of claim 1, wherein said method further comprises performing said at least one gesture on said gesture-enabled GUI application.
10. The method of claim 1, wherein said method further comprises monitoring said response of said gesture-enabled GUI application in accordance with said at least one gesture.
11. A system for dynamically testing a gesture-enabled graphical user interface (GUI) application, the system configured to:
create at least one gesture template indicating at least one gesture characteristic,
receive at least one test script including said at least one gesture characteristic,
retrieve said at least one gesture template corresponding to said test script,
translate said at least one retrieved gesture template into at least one gesture,
trigger said gesture-enabled GUI application to respond in accordance with said at least one gesture, and
generate at least one test report based on said response to validate said gesture-enabled GUI application.
12. The system of claim 11, wherein said at least one gesture comprises at least one of deictic gesture, manipulative gesture, semaphore gesture, gesticulation gesture, language gesture, and multiple gesture style.
13. The system of claim 11, wherein said system is further configured to store said at least one gesture template.
14. The system of claim 11, wherein said system is further configured to create said at least one test script.
15. The system of claim 11, wherein said at least one test script defines said at least one gesture characteristic corresponding to said at least one gesture template.
16. The system of claim 11, wherein said system is further configured to execute said at least one test script.
17. The system of claim 11, wherein said system is further configured to parse said at least one gesture characteristic to identify said at least one gesture.
18. The system of claim 11, wherein said system is further configured to map said at least one test script with said at least one gesture template.
19. The system of claim 11, wherein said system is further configured to perform said at least one gesture on said gesture-enabled GUI application.
20. The system of claim 11, wherein said system is further configured to monitor said response of said gesture-enabled GUI application in accordance with said at least one gesture.
21. A computer program product for dynamically testing a gesture-enabled graphical user interface (GUI) application, the product comprising:
an integrated circuit comprising at least one processor;
at least one memory having computer program code within said circuit, wherein said at least one memory and said computer program code, with said at least one processor, cause said product to:
create at least one gesture template indicating at least one gesture characteristic,
receive at least one test script including said at least one gesture characteristic,
retrieve said at least one gesture template corresponding to said test script,
translate said at least one retrieved gesture template into at least one gesture,
trigger said gesture-enabled GUI application to respond in accordance with said at least one gesture, and
generate at least one test report based on said response to validate said gesture-enabled GUI application.

Dated: 04-06-2013
Signature:
Vikram Pratap Singh Thakur
Patent Agent
FORM 2
The Patents Act, 1970
(39 of 1970)
&
The Patents Rules, 2005

COMPLETE SPECIFICATION
(SEE SECTION 10 AND RULE 13)

TITLE OF THE INVENTION

Method and system for automating and dynamically testing gesture recognition application
APPLICANTS:
Name : HCL Technologies Limited
Nationality : Indian
Address : HCL Technologies Ltd., 50-53 Greams Road, Chennai – 600006, Tamil Nadu, India

The following Specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:
FIELD OF INVENTION
[001] The embodiments herein relate to gesture recognition applications and, more particularly, to a mechanism for dynamically testing a gesture-enabled graphical user interface (GUI) application.
BACKGROUND OF INVENTION
[002] People spend much of their time with electronic devices such as, for example, but not limited to, computers, mobile phones, music players, and the like. Users generally prefer devices that are intuitive to use and interact with. Almost all electronic devices now provide gesture-based user interfaces, which allow users to naturally interact with and control the devices. These gesture-based user interfaces typically employ gesture recognition software, which can include template-based recognizers or parametric-based recognizers. Testing such gesture-based user interfaces, particularly touchless gesture-based user interfaces, involves significant challenges.
[003] Many different methods and systems have been proposed to test gesture-based user interface applications. To test the accuracy and performance of a gesture-based user interface application, conventional systems and methods require the user to perform the gestures physically and then verify that the application recognizes them accurately. The use of such systems and methods can take a significant amount of time, effort, cost, and the like. Therefore, automating the process of testing gesture-based user interface applications can eliminate the physical intervention of the user and significantly reduce the effort, time, and cost.
SUMMARY OF THE INVENTION
[004] Accordingly, the invention provides a method for dynamically testing a gesture-enabled graphical user interface (GUI) application. The method includes creating gesture template(s) indicating gesture characteristic(s), receiving a test script including the gesture characteristic, and retrieving the gesture template corresponding to the test script. Further, the method includes translating the retrieved gesture template into a gesture, triggering the gesture-enabled GUI application to respond in accordance with the gesture, and generating a test report based on the response to validate the gesture-enabled GUI application.
[005] In an embodiment, the gestures described herein include, for example, but are not limited to, deictic gestures, manipulative gestures, semaphore gestures, gesticulation gestures, language gestures, multiple gesture styles, and the like. Furthermore, the method includes storing the gesture template. Furthermore, the method includes creating and executing the test script. In an embodiment, the test script defines the gesture characteristic corresponding to the gesture template. Furthermore, the method includes parsing the gesture characteristic to identify the gesture. Furthermore, the method includes mapping the test script with the gesture template. Furthermore, the method includes performing the gesture on the gesture-enabled GUI application. Furthermore, the method includes monitoring the response of the gesture-enabled GUI application in accordance with the gesture.
[006] Accordingly, the invention provides a system for dynamically testing a gesture-enabled graphical user interface (GUI) application. The system is configured to create gesture template(s) indicating gesture characteristic(s), receive a test script including the gesture characteristic, and retrieve the gesture template corresponding to the test script. Further, the system is configured to translate the retrieved gesture template into a gesture, trigger the gesture-enabled GUI application to respond in accordance with the gesture, and generate a test report based on the response to validate the gesture-enabled GUI application.
[007] Furthermore, the system is configured to store the gesture template. Furthermore, the system is configured to create and execute the test script. In an embodiment, the test script defines the gesture characteristic corresponding to the gesture template. Furthermore, the system is configured to parse the gesture characteristic to identify the gesture. Furthermore, the system is configured to map the test script with the gesture template. Furthermore, the system is configured to perform the gesture on the gesture-enabled GUI application. Furthermore, the system is configured to monitor the response of the gesture-enabled GUI application in accordance with the gesture.
[008] Accordingly, the invention provides a computer program product for dynamically testing a gesture-enabled graphical user interface (GUI) application. The computer program product includes an integrated circuit. The integrated circuit includes a processor and a memory including computer program code within the circuit. Further, the memory and the computer program code, with the processor, cause the product to create gesture template(s) indicating gesture characteristic(s), receive a test script including the gesture characteristic, and retrieve the gesture template corresponding to the test script. Further, the memory and the computer program code, with the processor, cause the product to translate the retrieved gesture template into a gesture, trigger the gesture-enabled GUI application to respond in accordance with the gesture, and generate a test report based on the response to validate the gesture-enabled GUI application.
[009] These and other aspects of the embodiments herein will be better understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF THE FIGURES
[0010] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[0011] FIG. 1 is a block diagram illustrating generally, among other things, a high-level architecture of a system, according to the embodiments disclosed herein;
[0012] FIG. 2 shows an exemplary illustration of a gesture template, according to the embodiments disclosed herein;
[0013] FIG. 3 is a diagram illustrating various operations performed by the system as described in the FIG. 1, according to the embodiments disclosed herein;
[0014] FIG. 4 is a flowchart illustrating a method for dynamically testing a gesture-enabled GUI application, according to the embodiments disclosed herein; and
[0015] FIG. 5 illustrates a computing environment implementing the method and system as disclosed in the embodiments herein.

DETAILED DESCRIPTION OF INVENTION
[0016] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0017] The embodiments herein disclose a method and system for dynamically testing a gesture-enabled graphical user interface (GUI) application. Unlike conventional systems, the present invention can be used to automatically test the gesture-enabled GUI application without any user intervention. In an embodiment, one or more gesture templates indicating one or more gesture characteristics can be created and stored in a gesture template database. The system includes a test engine module configured to execute one or more test scripts and the corresponding gesture templates, so as to test/validate the gesture-enabled GUI application. The test script described herein can include the one or more gesture characteristics corresponding to the gesture templates. Further, the system can include a gesture emulation module configured to parse and translate the one or more gesture characteristics into a meaningful gesture. In an embodiment, the meaningful gestures described herein can include, for example, but are not limited to, deictic gestures, manipulative gestures, semaphore gestures, gesticulation gestures, language gestures, multiple gesture styles, or any other gesture. The gesture emulation module can be configured to trigger the gesture-enabled GUI application to respond in accordance with the gesture. The test engine module can be configured to monitor the gesture-enabled GUI application and generate a test report based on the response of the gesture-enabled application. An illustrative sketch of this flow follows.
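For illustration only, the following Python sketch traces the flow described in the preceding paragraph. The specification names the modules but not their interfaces, so every identifier here (GestureTemplate, emulate, run_script, the gui_app callable) is an assumption rather than the actual implementation.

from dataclasses import dataclass

@dataclass
class GestureTemplate:
    """One entry in the gesture template database (field names assumed)."""
    name: str           # e.g. "swipe_right"
    gesture_type: str   # e.g. "manipulative"
    velocity: float
    intensity: float

def emulate(template: GestureTemplate) -> dict:
    # Stand-in for the gesture emulation module (parser + translator).
    return {"name": template.name, "type": template.gesture_type,
            "velocity": template.velocity, "intensity": template.intensity}

def run_script(script, template_db, gui_app):
    """Test engine sketch: map each script step to its template, emulate
    the gesture, trigger the application, and record the response."""
    results = []
    for step in script:
        template = template_db[step["gesture"]]  # retrieve the template
        gesture = emulate(template)              # translate it into a gesture
        response = gui_app(gesture)              # trigger the GUI application
        results.append((template.name, step["expected"], response))
    return results                               # basis for the test report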
[0018] The proposed system and method are simple, robust, dynamic, inexpensive, and reliable for dynamically testing the gesture-enabled GUI application. Unlike conventional systems, instead of the user physically/manually performing the gesture, the gesture templates are used and automatically triggered to validate the gesture-enabled GUI application. The gesture can be triggered as if the user were performing it. The system and method can be used to automate the gesture-enabled GUI application testing process, particularly for touchless gesture-enabled GUI applications, where the input is normally provided by the user or some other object. The use of such automated systems and methods can eliminate the physical intervention of the user, reduce user effort, save time and cost, and make the testing process easier. Furthermore, the proposed system and method can be implemented on existing infrastructure and may not require extensive set-up or instrumentation.
[0019] FIG. 1 is a block diagram illustrating generally, among other things, a high-level architecture 100 of a system 102, according to the embodiments disclosed herein. In an embodiment, the system 102 can be configured to include a gesture recognition module 104, a gesture recognition API module 106, a test engine module 108, a gesture emulation module 110, and a gesture-enabled GUI module 112. In an embodiment, the system 102 described herein can be, for example, but not limited to, an electronic device, smart phone, tablet, communicator, personal digital assistant (PDA), computer, automatic teller machine (ATM), consumer electronic device, or any other system including gesture-based GUI applications to interact with and control the system.
[0020] In an embodiment, the gesture recognition module 104 can be configured to recognize one or more gestures (hereafter referred to as gesture(s)) performed by a user. In an embodiment, the gestures described herein can include, for example, but are not limited to, deictic gestures, manipulative gestures, semaphore gestures, gesticulation gestures, language gestures, multiple gesture styles, and the like. In an embodiment, the deictic gestures described herein can include pointing at an object virtually in the system 102 to establish the interaction. In an embodiment, the manipulative gestures described herein can include controlling the system 102 by using external support (for example, a stylus, mouse, hand, and the like) to perform actual movements on an object. This category includes touch-based (such as single-touch, multi-touch, and the like) gestures used for interacting with the objects. The manipulative gestures can be used to perform both 2-dimensional and 3-dimensional interaction, including empty-handed movements that mimic manipulations of the objects as in virtual reality interfaces.
[0021] In an embodiment, the semaphore gestures described herein can include using predefined signs, flags, arms, or strokes on the system 102 to perform the gesture. The semaphore gestures can include static poses or dynamic movements. For example, when a user joins thumb and forefinger together to represent an "OK" symbol, this gesture represents a static pose, while moving a hand in a waving motion is a dynamic semaphore gesture. Further, the semaphore gestures can refer to strokes or marks made with the mouse, stylus, or virtually through eye movements, gazing, and the like, which can be mapped onto various interface commands; for example, mouse strokes performed to control the navigation (back and forward) of a web browser. In an embodiment, the gesticulation gestures described herein can be combined with speech and may not require the user to perform any poses or to learn any gestures other than those that naturally accompany everyday speech. The gesticulations can include depictive or iconic gestures that may be used to clarify a verbal description of an object (for example, its physical shape or form) through the use of gestures that depict that shape. In an embodiment, the language gestures described herein can include sign language, like the semaphore gestures, along with spoken or written commands in a particular language. In an embodiment, the multiple gesture style described herein can combine two or more gestures such as described above. This taxonomy maps naturally onto a small enumeration, as sketched below.
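A minimal sketch, assuming Python test tooling; only the style names come from the taxonomy above, and nothing further is implied about the actual implementation:

from enum import Enum

class GestureStyle(Enum):
    DEICTIC = "deictic"              # pointing at a virtual object
    MANIPULATIVE = "manipulative"    # stylus/mouse/hand movements on objects
    SEMAPHORE = "semaphore"          # predefined signs, poses, or strokes
    GESTICULATION = "gesticulation"  # gestures naturally accompanying speech
    LANGUAGE = "language"            # sign language plus spoken/written commands
    MULTIPLE = "multiple"            # a combination of two or more styles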
[0022] In an embodiment, the gesture recognition module 104 can be configured to include sufficient hardware and firmware to recognize the gestures. The system 102 can include or integrate with any gesture-recognizing device, such as Kinect, Leap Motion, 2D/3D depth sensors, and the like, which may have the capability to detect gestures ranging from the face, hands, and eyes to the full body of the user.
[0023] In an embodiment, the gesture recognition API module 106 can be configured to gather and process the gesture data returned by the gesture recognition module 104. The gesture recognition API module 106 can be configured to transform the gesture data into a meaningful gesture. The gesture recognition API module 106 can be configured to include sufficient adapters to handle the data returned by the gesture recognition module 104.
[0024] In an embodiment, the test engine module 108 can be configured to execute/run the test scripts to test/validate the gesture-enabled GUI application output. The test script described herein can include the one or more gesture characteristics corresponding to the gesture templates. In an embodiment, the gesture characteristics described herein can include, for example, but are not limited to, gesture name, gesture type, gesture threshold, gesture velocity, gesture intensity, and the like. In an embodiment, the test engine module 108 can be configured to store the test scripts in a test script database 114. In an embodiment, the test engine module 108 can be configured to test the entire gesture-enabled GUI application individually or can integrate or communicate with any third-party testing suite engine 116. In an embodiment, the test engine module 108 can be configured to include sufficient interfaces to communicate with the third-party testing suite engines 116. A hypothetical shape for one test script entry is sketched below.
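The specification enumerates the gesture characteristics but does not prescribe a script format, so the entry below is purely illustrative; every field name and value is an assumption:

test_script = [
    {
        "gesture_name": "swipe_right",
        "gesture_type": "manipulative",
        "threshold": 0.8,          # assumed minimum recognition confidence
        "velocity": 1.5,           # assumed emulated movement speed
        "intensity": 0.6,
        "expected": "next_page",   # GUI response the test validates
    },
]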
[0025] In an embodiment, the gesture emulation module 110 can be configured to emulate the gesture in a way that it appears as if a user were performing the gesture in front of the system 102. In an embodiment, the gesture emulation module 110 can be configured to communicate with a gesture template database 118. The gesture template database 118 can be configured to store the gesture templates, including the one or more gesture characteristics. The gesture templates described herein can be standard gesture templates for standard gestures such as swipe right, swipe left, hovering, and the like. Further, an exemplary illustration of a gesture template is described in conjunction with the FIG. 2. The gesture emulation module 110 can be configured to include two components, namely a parser 120 and a translator 122. In an embodiment, the parser 120 can be configured to parse the gesture characteristics and signal the translator 122 to translate the gesture characteristics into device-specific data, which is internally sent to the gesture recognition device for detection. In an embodiment, the gesture emulation module 110 can be configured to send the translated gesture to the gesture-enabled GUI module 112. In an embodiment, the gesture-enabled GUI module 112 can be configured to include a gesture-enabled GUI application 124, which recognizes and responds in accordance with the received gesture. A sketch of the parser/translator pair follows.
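A minimal sketch of the parser 120 and translator 122, assuming the characteristics arrive as a dict; the synthetic frame format fed to the recognition device is invented for illustration:

def parse_gesture(characteristics: dict) -> dict:
    """Parser: check that the required gesture characteristics are present."""
    required = ("gesture_name", "gesture_type", "velocity", "intensity")
    missing = [key for key in required if key not in characteristics]
    if missing:
        raise ValueError(f"template missing characteristics: {missing}")
    return characteristics

def translate_gesture(characteristics: dict, frames: int = 10) -> list:
    """Translator: expand the characteristics into a stream of synthetic
    device frames, as if a user had performed the gesture."""
    step = float(characteristics["velocity"]) / frames
    return [{"t": i, "x": i * step, "y": 0.0,
             "intensity": characteristics["intensity"]}
            for i in range(frames)]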
[0026] The various modules and labels shown with respect to the FIG. 1 are only for illustrative purposes and do not limit the scope of the invention. Further, in real time, one or more modules can be added, deleted, integrated, or modified in any form, without departing from the scope of the invention. Furthermore, the various operations performed by the system 102 are described in conjunction with the FIG. 3.
[0027] FIG. 2 shows an exemplary illustration of a gesture template 200, according to the embodiments disclosed herein. In an embodiment, the gesture template 200 can be configured to include gesture characteristics 202. In an embodiment, the gesture characteristics described herein can include, for example, but are not limited to, gesture name, gesture type, gesture threshold, gesture velocity, gesture intensity, and the like. In an embodiment, the gesture template database 118 can be configured to store the gesture template 200 in any format. For example, as shown in the FIG. 2, the gesture template database 118 stores the gesture template in extensible markup language (XML) format; a hypothetical template and loader are sketched below. The gesture template 200 shown in the FIG. 2 is only for illustrative purposes and does not limit the scope of the invention.
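Because FIG. 2 is not reproduced in this text, the element names in the following XML sketch are assumptions rather than the actual template layout:

import xml.etree.ElementTree as ET

TEMPLATE_XML = """\
<gesture>
  <name>swipe_right</name>
  <type>manipulative</type>
  <threshold>0.8</threshold>
  <velocity>1.5</velocity>
  <intensity>0.6</intensity>
</gesture>
"""

def load_template(xml_text: str) -> dict:
    """Read one stored gesture template into a characteristics dict."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text.strip() for child in root}

print(load_template(TEMPLATE_XML))
# {'name': 'swipe_right', 'type': 'manipulative', 'threshold': '0.8', ...}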
[0028] FIG. 3 is a diagram illustrating various operations 300 performed by the system 102 as described in the FIG. 1, according to the embodiments disclosed herein. In an embodiment, at 302, the system 102 allows a tester or any other user to create the gesture templates. The gesture templates described herein can be standard gesture templates for standard gestures such as swipe right, swipe left, hovering, and the like. In an embodiment, the system 102 can allow the tester to define gesture characteristics, so as to create any type of gesture. The gesture characteristics described herein can include, for example, but are not limited to, gesture name, gesture type, gesture threshold, gesture velocity, gesture intensity, and the like. In an embodiment, the gesture templates created by the tester can be stored in the gesture template database 118.
[0029] In an embodiment, at 304, the test engine module 108 can be configured to receive the test scripts to test/validate the gesture-enabled GUI application. The test script described herein can include the one or more gesture characteristics corresponding to the gesture templates. In an embodiment, the test engine module 108 can receive the test script from the test script database 114. In an embodiment, the test engine module 108 can be configured to integrate with any third-party testing suite engine 116, so as to receive the test script.
[0030] In an embodiment, at 306, the test engine module 108 can be configured to retrieve the gesture templates corresponding to the gesture characteristics associated with the test script. The test engine module 108 is configured to map the test script with the gesture templates and communicate with the gesture template database 118 to retrieve the gesture templates corresponding to the gesture characteristics associated with the test script. In an embodiment, at 308, the test engine module 108 can be configured to send the gesture templates corresponding to the gesture characteristics associated with the test script to the gesture emulation module 110. In an embodiment, the gesture emulation module 110, in communication with the parser 120, can be configured to parse the gesture template. In an embodiment, the gesture emulation module 110, in communication with the translator 122, can be configured to translate the gesture template into a meaningful gesture.
[0031] In an embodiment, at 310, the gesture emulation module 110 can be configured to trigger the gesture to the gesture-enabled GUI module 112 to respond in accordance with the gesture. The gesture emulation module 110 triggers the gesture as if the gesture were performed by a user. In an embodiment, the gesture-enabled GUI module 112, in communication with the gesture-enabled GUI application, can be configured to respond in accordance with the gesture. In an embodiment, at 312, the test engine module 108 can be configured to monitor the response or behavior of the gesture-enabled GUI with respect to the received gesture. In an embodiment, the test engine module 108 can be configured to provide the response information to the third-party test engine 116. In an embodiment, at 314, the test engine module 108 can be configured to generate a test report including the response performed by the gesture-enabled GUI application. A sketch of this monitoring/reporting step follows.
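For operations 312-314, a self-contained sketch of how observed responses might be folded into a report; the response values and the PASS/FAIL format are placeholders, not prescribed by the specification:

def build_report(observations):
    """observations: iterable of (gesture, expected, observed) triples
    collected while monitoring the gesture-enabled GUI application."""
    lines = []
    for gesture, expected, observed in observations:
        verdict = "PASS" if observed == expected else "FAIL"
        lines.append(f"{verdict}  {gesture}: expected={expected}, observed={observed}")
    return "\n".join(lines)

print(build_report([
    ("swipe_right", "next_page", "next_page"),  # application responded correctly
    ("hover", "show_tooltip", "no_response"),   # recognition failure surfaced
]))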
[0032] FIG. 4 is a flowchart illustrating a method 400 for dynamically testing a gesture-enabled GUI application, according to the embodiments disclosed herein. The method 400 and the other description described herein provide a basis for a control program which can be implemented using a microcontroller, a microprocessor, or a combination thereof. In an embodiment, at step 402, the method 400 includes creating the gesture templates. The gesture templates described herein can be standard gesture templates for standard gestures such as swipe right, swipe left, hovering, and the like. In an example, the method 400 allows a tester or any other user to create the gesture templates. The tester can define gesture characteristics, for example, but not limited to, gesture name, gesture type, gesture threshold, gesture velocity, gesture intensity, and the like, so as to create any type of gesture. The method 400 allows the gesture template database 118 to store the gesture templates created by the tester.
[0033] In an embodiment, at step 404, the method 400 includes running or executing the test script including the gesture characteristics, so as to test/validate the gesture-enabled GUI application. The test script described herein can include the one or more gesture characteristics corresponding to the gesture templates. In an example, the method 400 allows the test engine module 108 to receive the test script from the test script database 114. In another example, the method 400 allows the test engine module 108 to integrate with any third-party testing suite engine 116, so as to receive the test script.
[0034] In an embodiment, at step 406, the method 400 includes retrieving the gesture template corresponding to the test script. In an example, the method 400 allows the test engine module 108 to map the test script with the gesture templates and retrieve the gesture templates corresponding to the gesture characteristics associated with the test script. The test engine module 108 communicates with the gesture template database 118 to retrieve the gesture templates corresponding to the gesture characteristics associated with the test script. In an embodiment, the method 400 allows the test engine module 108 to send the retrieved gesture template to the gesture emulation module 110. In an embodiment, at step 408, the method 400 includes parsing the retrieved gesture templates. In an example, the method 400 allows the parser 120 to parse the gesture template. In an embodiment, at step 410, the method 400 includes translating the retrieved gesture templates into a meaningful gesture. In an example, the method 400 allows the translator 122 to translate the gesture template into the meaningful gesture.
[0035] In an embodiment, at step 412, the method 400 includes triggering the gesture to the gesture-enabled GUI module 112 to respond in accordance with the gesture. In an example, the method 400 allows the gesture emulation module 110 to trigger the gesture as if the gesture were performed by a user. The method 400 allows the gesture-enabled GUI application to respond in accordance with the gesture. In an embodiment, at step 414, the method 400 includes generating a test report including the response performed by the gesture-enabled GUI application. In an example, the method allows the test engine module 108 to monitor the gesture-enabled GUI and generate the test report including its response or behavior with respect to the received gesture.
[0036] The various actions, steps, blocks, or acts described with respect to the FIGS. 3 and 4 can be performed in sequential order, in random order, simultaneously, in parallel, or in a combination thereof. Further, in some embodiments, some of the steps, blocks, or acts can be omitted, skipped, modified, or added without departing from the scope of the invention.
[0037] FIG. 5 illustrates a computing environment 502 implementing the method and systems as disclosed in the embodiments herein. As depicted, the computing environment 502 comprises at least one processing unit 504 that is equipped with a control unit 506 and an Arithmetic Logic Unit (ALU) 508, a memory 510, a storage unit 512, a plurality of networking devices 514, and a plurality of input/output (I/O) devices 516. The processing unit 504 is responsible for processing the instructions of the algorithm. The processing unit 504 receives commands from the control unit 506 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 508.
[0038] The overall computing environment 502 can be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special media and other accelerators. The processing unit 504 is responsible for processing the instructions of the algorithm. Further, the plurality of processing units 504 may be located on a single chip or over multiple chips.
[0039] The instructions and code required for the implementation are stored in either the memory unit 510 or the storage 512, or both. At the time of execution, the instructions may be fetched from the corresponding memory 510 and/or storage 512 and executed by the processing unit 504. In the case of hardware implementations, various networking devices 514 or external I/O devices 516 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit.
[0040] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. The elements shown in FIGS. 1 through 5 include blocks, steps, operations, and acts, which can be at least one of a hardware device, or a combination of hardware device and software module.
[0041] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Documents

Application Documents

# Name Date
1 2447-CHE-2013 FORM-9 05-06-2013.pdf 2013-06-05
2 2447-CHE-2013 FORM-18 05-06-2013.pdf 2013-06-05
3 Form 5.pdf 2013-06-12
4 FORM 3.pdf 2013-06-12
5 FORM 2.pdf 2013-06-12
6 Drawings.pdf 2013-06-12
7 abstract2447-CHE-2013.jpg 2013-06-19
8 2447-CHE-2013 FORM-1 25-06-2013.pdf 2013-06-25
9 2447-CHE-2013 POWER OF ATTORNEY 25-06-2013.pdf 2013-06-25
10 2447-CHE-2013 CORRESPONDENCE OTHERS 25-06-2013.pdf 2013-06-25
11 2447-CHE-2013-FER.pdf 2019-05-13
12 2447-CHE-2013-AbandonedLetter.pdf 2019-11-15

Search Strategy

1 2447che2013_09-05-2019.pdf