Abstract: Disclosed is a system and method for data logging of a device (104). The system comprises a capacitive touch sensor (106), a camera (110), and a data logging hardware (102). The capacitive touch sensor (106) receives inputs of a user. The camera (110) records each action performed by the device (104) in response to each input. The data logging hardware (102) extracts a plurality of feature vectors from an output of the camera corresponding to each action in order to identify a current state of the device. A neuromorphic hardware (224) identifies the current state of the device by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors. Further, a data log of the device (104) is generated based on the series of inputs, the actions performed by the device, and the current state of the device (104). [To be published with Figure 1]
PRIORITY INFORMATION
[001] This patent application does not claim priority from any application.
TECHNICAL FIELD
[002] The present subject matter described herein, in general, relates to data logging of a device and more particularly to facilitating a non-invasive method of data logging of the device.
BACKGROUND
[003] Typically, a data log is generated to test each functionality of a device. The data log is a log file comprising information about inputs, outputs, and performance of the device upon receipt of the inputs. Traditionally, a Subject Matter Expert (SME) may analyze the log file to identify any defects in the working of the device. In order to generate the data log, current systems and methodologies embed a data logging firmware inside the device to capture actions performed by the device. However, the data logging firmware may degrade the performance of the device. It is to be noted that the data logging firmware is limited to the coded functionality added in the device. In addition, the log file may be retrieved only during maintenance or service of the device, contributing to a delay in feedback and improvement of the device.
SUMMARY
[004] Before the present systems and methods for facilitating data logging of a device are described, it is to be understood that this application is not limited to the particular systems, and methodologies described, as there can be multiple possible embodiments which are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce concepts related to systems and methods for facilitating data logging of a device and the concepts are further described below in the detailed description. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[005] In one implementation, a method for facilitating data logging of a device is disclosed. In order to generate a data log, initially, inputs of a user may be received with a device using a capacitive touch sensor mounted on each input key of the device. Upon receiving the inputs, each action performed by the device may be recorded in response to each input by using a camera mounted to capture the device. After recording, a plurality of feature vectors may be identified from an output of the camera corresponding to each action in order to identify a current state of the device. Further, a set of unique feature vectors corresponding to the series of inputs may be identified by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors. In one aspect, the set of unique feature vectors may be identified by a neuromorphic hardware. It is to be noted that the set of unique feature vectors indicates the current state of the device. Furthermore, a data log of the device may be generated based on the series of inputs, actions performed by the device and the current state of the device. In one aspect, the aforementioned method for facilitating data logging of the device may be performed by a processor using programmed instructions stored in a memory.
[006] In another implementation, a system for facilitating data logging of a device is disclosed. The system may comprise a capacitive touch sensor, a camera, and a data logging hardware. The data logging hardware may comprise a neuromorphic hardware, a processor and a memory coupled to the processor. The processor may execute a set of instructions stored in the memory. The neuromorphic hardware may be coupled to the processor and the memory to execute instructions. Initially, the capacitive touch sensor mounted on each input key of a device may receive inputs of a user. Further the camera may record each action performed by the device in response to each input. Subsequently, the data logging hardware may extract a plurality of feature vectors from an output of the camera corresponding to each action in order to identify a current state of the device. Furthermore, the neuromorphic hardware may identify a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors. In one aspect, the set of unique feature vectors may indicate the current state of the device. Upon identifying the current state of the device, the data logging hardware may generate a data log of the device based on the series of inputs, actions performed by the device and the current state of the device.
[007] In yet another implementation, a non-transitory computer readable medium embodying a program executable in a computing device for facilitating data logging of a device is disclosed. The program may comprise a program code for receiving inputs of a user with a device using a capacitive touch sensor mounted on each input key of the device. Further, the program may comprise a program code for recording each action performed by the device in response to each input by using a camera mounted to capture the device. The program may comprise a program code for extracting a plurality of feature vectors from an output of the camera corresponding to each action in order to identify a current state of the device. The program may comprise a program code for identifying a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors. In one aspect, the set of unique feature vectors indicates the current state of the device. The program may comprise a program code for generating a data log of the device based on the series of inputs, the actions performed by the device, and the current state of the device.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The foregoing detailed description of embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosure, example constructions of the disclosure are shown in the present document; however, the disclosure is not limited to the specific methods and apparatus for facilitating data logging of a device as disclosed in the document and the drawings.
[009] The detailed description is given with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0010] Figure 1 illustrates a network implementation of a system for facilitating data logging of a device, in accordance with an embodiment of the present subject matter.
[0011] Figure 2 illustrates the system, in accordance with an embodiment of the present subject matter.
[0012] Figure 3 illustrates a method for facilitating data logging of a device, in accordance with an embodiment of the present subject matter.
[0013] Figure 4 and Figure 5 illustrate exemplary implementations of the system, in accordance with an embodiment of the present subject matter.
DETAILED DESCRIPTION
[0014] Some embodiments of this disclosure, illustrating all its features, will now be discussed in detail. The words "receiving," "recording," "extracting," "identifying," and "generating," and other forms thereof, are intended to be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a," "an," and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in the practice or testing of embodiments of the present disclosure, the exemplary systems and methods for facilitating data logging of a device are now described. The disclosed embodiments are merely exemplary of the disclosure, which may be embodied in various forms.
[0015] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0016] The present invention discloses a system and method for facilitating data logging of a device. Examples of the device may include, but are not limited to, a mechanical device, an electrical device, and an electro-mechanical device. It is to be noted that the device may not have a memory and a processor to execute programmed instructions. Thus, there exists a challenge in generating a data log for the device. The present system utilizes a camera, a capacitive touch sensor, and a data logging hardware to facilitate data logging of the device. The capacitive touch sensor may be mounted on each key of the device. The capacitive touch sensor may return coordinates of the key to the data logging hardware when each key is pressed by a user. In one implementation, a robotic arm may be used to press the key. In another implementation, a touch screen with the capacitive touch sensor may be mounted on the device.
[0017] The camera may be mounted facing the device. The camera may record each action performed by the device in response to the inputs received. Each action may be recorded in a digital media format. Examples of the digital media format may include, but are not limited to, a video, a GIF, and an image.
[0018] The data logging hardware may receive the coordinates and the recorded action corresponding to the inputs. The data logging hardware may comprise a neuromorphic hardware to identify the current state of the device from the features extracted from the camera output by the processor. The processor may utilize image processing techniques to extract the plurality of features. It is to be noted that the output of the camera may be an image, a video, a GIF, or the like. Further, the neuromorphic hardware identifies a set of unique features corresponding to the series of inputs along with the identified current state. In one embodiment, once the current state is identified, the neuromorphic hardware may compare the current state of the device, along with the inputs, with a predefined knowledge of the state of the device corresponding to each input.
[0019] In one embodiment, when the current state of the device is not found in the predefined knowledge of the state of the device, the neuromorphic hardware assigns a weight to the current state of the device. The predefined knowledge may be updated with the current state for future scenarios. While aspects of the described system and method for facilitating data logging of the device may be implemented in any number of different computing systems, environments, and/or configurations, the embodiments are described in the context of the following exemplary system.
[0020] Referring now to Figure 1, a network implementation 100 of a system for facilitating data logging of a device 104 is disclosed. The system comprises a capacitive touch sensor 106, a camera 110, and a data logging hardware 102. Initially, the capacitive touch sensor 106 mounted on each input key of the device 104 may receive inputs of a user. Further, the camera 110 may record each action performed by the device 104 in response to each input. Subsequently, the data logging hardware 102 may extract a plurality of feature vectors from an output of the camera 110 corresponding to each action in order to identify a current state of the device. The current state of the device is identified by a neuromorphic hardware present in the data logging hardware 102. Furthermore, the neuromorphic hardware may identify a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector, along with the inputs provided to the device, with a predefined knowledge of feature vectors. In one aspect, the set of unique feature vectors may indicate the current state of the device. Upon identifying the current state of the device, the data logging hardware 102 may generate a data log of the device based on the series of inputs, the actions performed by the device, and the current state of the device.
[0021] Although the present disclosure is explained considering that the system comprising the data logging hardware 102 for facilitating data logging of a device 104 is implemented on a server, it may be understood that the data logging hardware 102 may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and a cloud-based computing environment. It will be understood that the data logging hardware 102 may be used by multiple users through one or more user devices 112-1, 112-2…112-N, collectively referred to as user 112 or stakeholders, hereinafter, or applications residing on the user devices 112 for facilitating data logging. In one implementation, the data logging hardware 102 may comprise the cloud-based computing environment in which a user may operate individual computing systems configured to execute remotely located applications. Examples of the user devices 112 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. The user devices 112 are communicatively coupled to the data logging hardware 102 through a network 108.
[0022] In one implementation, the network 108 may be a wireless network, a wired network or a combination thereof. The network 108 can be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 108 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 108 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[0023] Referring now to Figure 2, a data logging hardware 102 for facilitating data logging of a device is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the data logging hardware 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the at least one processor 202 is configured to fetch and execute computer-readable instructions stored in the memory 206.
[0024] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the data logging hardware 102 for facilitating data logging to interact with the user directly or through the client devices 112. Further, the I/O interface 204 may enable the data logging hardware 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 can facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server. The I/O interface 204 may receive the inputs from the touch sensor 106 through a communication interface such as serial, I2C, USB, SPI, or another communication protocol.
[0025] The memory 206 may include any computer-readable medium or computer program product known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0026] The modules 208 include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular abstract data types. In one implementation, the modules 208 may include an extraction module 212, an identification module 214, a generation module 216, and other modules 218. The other modules 218 may include programs or coded instructions that supplement applications and functions of the data logging hardware 102 for facilitating data logging. The modules 208 described herein may be implemented as software modules that may be executed in the cloud-based computing environment of the data logging hardware 102.
[0027] The data 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The data 210 may also include a system database 220 and other data 222. The other data 222 may include data generated as a result of the execution of one or more modules in the other modules 218.
[0028] As there are various challenges observed in the existing art, the challenges necessitate the need to build the data logging hardware 102 for facilitating data logging. In order to facilitate data logging, at first, a user may use the device 104 to access the data logging hardware 102 via the I/O interface 204. The user may register themselves using the I/O interface 204 in order to use the data logging hardware 102. In one aspect, the user may access the I/O interface 204 of the data logging hardware 102. The data logging hardware 102 may employ the extraction module 212, the identification module 214, and the generation module 216. The detailed functioning of the modules is described below with the help of the figures.
[0029] The present system facilitates data logging of a device 104. The system comprises a camera 110, a capacitive touch sensor 106, and a data logging hardware 102. It is to be understood that the device 104 may be a mechanical device, an electrical device, an electromechanical device, or the like. In one implementation, the device 104 may have a memory and a processor to execute programmed instructions stored in the memory. In another implementation, the device 104 may not have the memory and the processor. In order to generate a data log of the device 104, initially, a user may interact with the device 104. It is to be noted that the device 104 is capable of receiving user inputs and providing an output. In one aspect, the output may be displayed on a Graphical User Interface (GUI) of the device 104. In another aspect, the output may be a result of a process performed by the device upon receipt of the user inputs.
[0030] The device 104 may have a capacitive touch sensor 106 mounted on each input key of the device 104. The capacitive touch sensor 106 is configured to return coordinates of each key being pressed by the user while providing inputs to the device 104. In one aspect, the inputs may be received from a robotic arm, a stencil, a touch screen, a mouse, and the like. In another aspect, the touch screen may have the capacitive touch sensor 106 installed beneath it. The capacitive touch sensor 106 may return the coordinates of one or more touch points entered by the user. Similarly, if the inputs are provided by using the robotic arm, the capacitive touch sensor 106 may return the coordinates of each key pressed by the robotic arm.
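For illustration only, the key-to-coordinate reporting described above may be sketched as follows. The keypad layout, function name, and interface are hypothetical assumptions introduced for this sketch and are not part of the disclosed embodiments:

```python
# Illustrative sketch: a capacitive touch sensor returning the coordinates
# of each pressed key. The layout and function name are hypothetical.

KEYPAD_LAYOUT = {
    # key label -> (x, y) coordinate on the device face (assumed layout)
    "1": (0, 0), "2": (1, 0), "3": (2, 0),
    "4": (0, 1), "5": (1, 1), "6": (2, 1),
}

def report_key_press(key):
    """Return the coordinates of a pressed key, as the sensor might
    report them to the data logging hardware."""
    if key not in KEYPAD_LAYOUT:
        raise KeyError("unknown key: %s" % key)
    return KEYPAD_LAYOUT[key]
```

In this sketch the same reporting path serves both a user's finger and a robotic arm, since only the key coordinates reach the data logging hardware.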
[0031] Once the inputs are received, the camera 110, facing towards the device 104, is configured to record each action performed by the device 104 in response to each input. In one implementation, the action may be recorded in at least one of an image format, a GIF format, and a video format. In one aspect, the camera 110 may have a memory and a processor to execute stored instructions. The memory may be used to store each action recorded by the camera 110. The processor may be configured to communicate each action to the data logging hardware 102. In another aspect, the camera 110 may be connected to the data logging hardware 102 via a USB cable or a data transfer cable. It must be noted that the camera 110 may be configured to capture a sequence of actions performed by the device 104 upon receipt of a sequence of inputs.
[0032] Once each action is recorded, the extraction module 212 extracts a plurality of feature vectors from an output of the camera corresponding to each action in order to identify a current state of the device. The processor 202 uses various image processing techniques to extract the plurality of features from the actions of the device. Once the plurality of features is extracted, the data logging hardware 102 engages the neuromorphic hardware 224 to identify the current state of the device. The neuromorphic hardware 224 further executes the identification module 214 for identifying a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors, along with the state of the device identified for the action. It is to be noted that the set of unique feature vectors indicates the current state of the device.
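As one minimal illustration of the feature extraction step, a normalized intensity histogram of a camera frame can serve as a feature vector. The disclosure does not prescribe a specific image processing technique; the histogram below is an illustrative stand-in, and the frame representation (a 2-D list of grayscale pixels) is an assumption of this sketch:

```python
def extract_feature_vector(frame, bins=4):
    """Compute a normalized intensity histogram of a grayscale frame.

    `frame` is a 2-D list of pixel intensities in [0, 255]. A histogram
    is only one simple stand-in for the image processing techniques the
    processor may apply to the camera output.
    """
    counts = [0] * bins
    total = 0
    for row in frame:
        for pixel in row:
            # Map the pixel intensity to one of `bins` equal-width buckets.
            counts[min(pixel * bins // 256, bins - 1)] += 1
            total += 1
    return [c / total for c in counts]
```

A feature vector computed this way for each recorded action can then be handed to the neuromorphic hardware for comparison against the predefined knowledge.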
[0033] The neuromorphic hardware 224 may identify the current state using machine learning techniques. In one embodiment, the neuromorphic hardware 224 assigns a weight to the current state of the device based on a predefined list comprising a mapping of each state of the device to a predefined weight. It is to be noted that the weight is assigned to the current state of the device when the current state does not match at least one state of the device stored in the system database 220. The current state, along with the series of inputs and the action performed by the device, may create a neuron. The neuromorphic hardware stores the neuron for each current state of the device in the system database 220. The predefined list may be stored in the system database 220 as a knowledge base. The predefined weight may be stored in the system database 220. It must be noted that not every action is stored or assigned a weight; only the actions with differences are assigned a weight. In other words, the neurons corresponding to differences may be stored to reduce memory consumption and processing overhead.
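The matching and neuron-storage behavior described above may be sketched as follows. A nearest-neighbour check with a distance threshold stands in for the neuromorphic comparison; the class name, threshold value, and weight scheme are illustrative assumptions, not details from the disclosure:

```python
def l2_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class StateKnowledge:
    """Illustrative sketch of identifying the current state against a
    predefined knowledge of feature vectors."""

    def __init__(self, threshold=0.1):
        self.neurons = []  # each neuron: state + feature vector + inputs + weight
        self.threshold = threshold

    def identify(self, feature_vector, inputs, default_weight=1.0):
        for neuron in self.neurons:
            if l2_distance(feature_vector, neuron["vector"]) <= self.threshold:
                return neuron["state"]  # matched a known state
        # Unknown state: assign a weight and store a new "neuron", so only
        # states that differ from known ones are retained (reducing memory
        # consumption and processing overhead).
        state = "state_%d" % len(self.neurons)
        self.neurons.append({"vector": feature_vector, "state": state,
                             "inputs": inputs, "weight": default_weight})
        return state
```

Under this sketch, repeated actions that produce near-identical feature vectors resolve to the same stored neuron, while only genuinely new states enlarge the knowledge base.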
[0034] Once the current state of the device is matched with the predefined knowledge of the feature vectors, the generation module 216 generates a data log of the device 104. The data log is generated based on the series of inputs, the actions performed by the device, and the current state of the device. The data log indicates each action performed by the device 104 corresponding to the inputs or the series of inputs provided by the user. The data log may be stored in the system database 220. The data log may enable the system to identify a defect in the device 104. In one aspect, the data log may be a text file or a presentation having screen recordings of the GUI. In another aspect, if the device 104 does not have a GUI, then the data log may not have video clips or images of the device 104. In one implementation, the data log may be utilized for performing testing of the device 104.
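The assembly of a data log entry from the series of inputs, the recorded actions, and the identified state may be sketched as below. The JSON-lines text format is an illustrative choice of this sketch; the disclosure only requires that the log relate inputs, actions, and state (e.g., as a text file, optionally with screen recordings):

```python
import json
import time

def generate_data_log(inputs, actions, current_state):
    """Assemble one data-log entry relating the series of inputs, the
    actions performed by the device, and the identified current state.
    Returns a single JSON line suitable for appending to a log file."""
    entry = {
        "timestamp": time.time(),
        "inputs": inputs,
        "actions": actions,
        "current_state": current_state,
    }
    return json.dumps(entry)
```

An SME or an automated test harness can later parse such entries to trace each action back to the input that triggered it.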
[0035] It is to be noted that the generation module 216 generates the data log in a non-invasive manner. In other words, the system facilitates generation of the data log by interacting only with the interface of the device and not with the functional blocks of the device. Thus, the system is capable of generating the data log for any device irrespective of a functional configuration of the device.
[0036] Referring now to Figure 4, an exemplary implementation 400 of the system is disclosed in accordance with an embodiment of the subject matter. It is to be noted that, in this implementation, the device 104 neither comprises a display screen nor are the actions performed by the device 104 recorded by a camera. During operation, when an input is received, the capacitive touch sensor 106 may provide the coordinates of the key press to the data logging hardware 102. Upon receipt of the input, the data logging hardware 102 identifies the current state of the device by comparing actions performed by the device with a predefined list of actions corresponding to the input provided by the user.
[0037] Referring now to Figure 5, an exemplary implementation 500 of the system is disclosed in accordance with an embodiment of the subject matter. Consider that the device 104 comprises the display 114 to display an output of the device. A robotic arm 502 is configured to provide inputs to the device 104. It is to be noted that the camera 110 records the location of a pointer of the robotic arm when the pointer interacts with the device 104. Further, once the robotic arm 502 moves from one location to another, the camera 110 records the new location and transmits the corresponding coordinates to the data logging hardware 102. Furthermore, the data logging hardware 102 computes the distance between different locations of the pointer to identify the touch point. Subsequently, the neuromorphic hardware present in the data logging hardware 102 compares the current state captured by the camera 110 with the predefined knowledge of the state of the device 104. If the current state does not match the predefined knowledge of the state, the neuromorphic hardware may store the current state as a neuron in the system database. It is to be noted that the neuron may comprise the current screen, the series of inputs, and the actions performed by the device 104. Furthermore, the data logging hardware 102 may generate a data log based on the neuron to facilitate testing of the device 104.
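The distance computation over successive pointer locations described above may be sketched as follows. Treating a near-zero movement between consecutive camera frames as a touch (a dwell heuristic, with threshold `epsilon`) is an illustrative assumption of this sketch; the disclosure states only that distances between pointer locations are computed to identify the touch point:

```python
def pointer_distance(p1, p2):
    """Euclidean distance between two recorded pointer locations."""
    return ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5

def identify_touch_points(locations, epsilon=1.0):
    """Flag locations where the robotic-arm pointer dwelled, i.e. moved
    less than `epsilon` between consecutive recorded locations, as the
    likely touch points. Both the heuristic and `epsilon` are assumed
    for illustration."""
    touches = []
    for prev, curr in zip(locations, locations[1:]):
        if pointer_distance(prev, curr) < epsilon:
            touches.append(curr)
    return touches
```

The coordinates so identified can then be correlated with the capacitive touch sensor output and the recorded screen state for the corresponding action.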
[0038] Referring now to Figure 3, a method 300 for facilitating data logging of a device is shown, in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, etc., that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0039] The order in which the method 300 for facilitating data logging of a device is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented as described in the system.
[0040] At block 302, inputs of a user with a device may be received using a capacitive touch sensor mounted on each input key of the device. In one implementation, the inputs may be stored at a system database 220.
[0041] At block 304, each action performed by the device may be recorded in response to each input by using a camera mounted to capture the device. In one implementation, each action may be stored at the system database 220.
[0042] At block 306, differences between each action and a corresponding predefined action associated with each input may be determined. In one aspect, the differences are determined by using a neuromorphic hardware. In another aspect, the differences are one or more features of the device. In one implementation, the differences may be determined by an identification module 214 along with the neuromorphic hardware and stored at the system database 220.
[0043] At block 308, the output state of the device identified at block 306, along with the inputs to the device, may be used to identify the current state of the device using the neuromorphic hardware. If the current state is not known to the neuromorphic hardware, the system stores the knowledge in the neurons and also in the system database 220.
[0044] At block 310, a data log of the device may be generated based on a series of inputs and corresponding differences determined for each input on the neuromorphic hardware. In one implementation, the data log may be generated by a generation module 216 and stored at the system database 220.
[0045] Exemplary embodiments discussed above may provide certain advantages. Though not required to practice aspects of the disclosure, these advantages may include those provided by the following features.
[0046] Some embodiments enable a system and a method to generate a data log of the device in a non-invasive manner.
[0047] Some embodiments enable a system and a method to dynamically generate a data log of a mechanical, an electrical, or an electromechanical device.
[0048] Some embodiments enable a system and a method to generate a data log for a legacy system as well as for a new product under development.
[0049] Some embodiments enable a system and a method to use a neuromorphic hardware to generate a log of the device.
[0050] Some embodiments enable a system and a method to speed up data logging in real time.
[0051] Some embodiments enable a system and a method to generate scripts for automation, wherein the inputs provided by the user are used for testing.
[0052] Although implementations for methods and systems for facilitating data logging of a device have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for facilitating data logging of the device.
Claims:
1. A method for facilitating data logging of a device (104), the method comprising:
receiving, by a processor (202), a series of inputs of a user to the device (104) using a capacitive touch sensor (106) mounted on each input key of the device (104);
recording, by the processor (202), each action performed by the device in response to each input by using a camera (110) mounted to capture the device (104);
extracting, by the processor (202), a plurality of feature vectors from an output of the camera (110) corresponding to each action in order to identify a current state of the device (104);
identifying, by a neuromorphic hardware (224), a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors, wherein the set of unique feature vectors indicates the current state of the device; and
generating, by the processor (202), a data log of the device (104) based on the series of inputs, actions performed by the device (104) and the current state of the device.
2. The method of claim 1, wherein the inputs are received by at least one of a robotic arm, a touch screen, and a mouse.
3. The method of claim 2, wherein the touch screen has the capacitive touch sensor (106) to receive the inputs.
4. The method of claim 1, wherein the action is recorded in at least one of an image format, a GIF format, and a video format.
5. The method of claim 1, wherein the device (104) is one of a mechanical device, an electronic device, and an electro-mechanical device.
6. The method of claim 1, wherein the neuromorphic hardware is configured to store the set of unique feature vectors of the action performed in the device along with the series of inputs.
7. A system for facilitating data logging of a device (104), the system comprising:
a capacitive touch sensor (106) mounted on each input key of a device (104) to receive a series of inputs of a user;
a camera (110) facing towards the device (104), wherein the camera (110) is configured to record each action performed by the device (104) in response to each input; and
a data logging hardware (102) coupled to the device to receive the inputs and each action recorded by the camera (110), wherein the data logging hardware (102) comprises:
a processor (202);
a memory (206) coupled to the processor (202), wherein the processor (202) executes a set of instructions stored in the memory (206) to:
extract a plurality of feature vectors from an output of the camera (110) corresponding to each action in order to identify a current state of the device (104);
identify, by a neuromorphic hardware (224), a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors, wherein the set of unique feature vectors indicates the current state of the device; and
generate a data log of the device (104) based on the series of inputs, actions performed by the device (104) and the current state of the device.
8. The system of claim 7, wherein the inputs are received by at least one of a robotic arm, a touch screen, and a mouse.
9. The system of claim 8, wherein the touch screen has the capacitive touch sensor (106) to receive the inputs.
10. The system of claim 7, wherein the action is recorded in at least one of an image format, a GIF format, and a video format.
11. The system of claim 7, wherein the device (104) is one of a mechanical device, an electronic device, and an electro-mechanical device.
12. The system of claim 7, wherein the neuromorphic hardware (224) is configured to store the set of unique feature vectors of the action performed in the device along with the series of inputs.
13. A non-transitory computer readable medium embodying a program executable in a computing device for facilitating data logging of a device, the program comprising:
a program code for receiving a series of inputs of a user to the device using a capacitive touch sensor mounted on each input key of the device;
a program code for recording each action performed by the device in response to each input by using a camera mounted to capture the device;
a program code for extracting a plurality of feature vectors from an output of the camera corresponding to each action in order to identify a current state of the device;
a program code for identifying a set of unique feature vectors corresponding to the series of inputs by comparing each feature vector from the plurality of feature vectors with a predefined knowledge of feature vectors, wherein the set of unique feature vectors indicates the current state of the device; and
a program code for generating a data log of the device based on the series of inputs, actions performed by the device and the current state of the device.
14. The method of claim 1, wherein the current state of the device is determined by a neuromorphic hardware (224) present in the data logging hardware (102), and wherein the current state is determined by:
extracting a plurality of feature vectors from an output of the camera (110) corresponding to each action in order to identify a current state of the device;
comparing the extracted feature vectors for each action performed in the device with the predefined knowledge of the feature vectors; and
extracting the feature vectors from the series of inputs and identifying the current state of the device using the extracted features along with the identified output state.
15. The system of claim 7, wherein the current state of the device is determined by a neuromorphic hardware (224) present in the data logging hardware (102), and wherein the current state is determined by:
extracting a plurality of feature vectors from an output of the camera (110) corresponding to each action in order to identify a current state of the device;
comparing the extracted feature vectors for each action performed in the device with the predefined knowledge of the feature vectors; and
extracting the feature vectors from the series of inputs and identifying the current state of the device using the extracted features along with the identified output state.