Digital Writing Exchange

Abstract: The present disclosure relates to system(s) and method(s) for digital writing exchange. The system comprises a memory and a processor coupled to the memory. The processor is configured to execute programmed instructions stored in the memory. In one embodiment, the processor may execute programmed instructions stored in the memory to receive a template comprising a plurality of graphical elements. The plurality of graphical elements may be configured to receive an input from an input device. Further, the processor may execute programmed instructions stored in the memory to assign an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates associated with the graphical element. Further, the processor may execute programmed instructions stored in the memory to superimpose the template on a machine readable pattern selected from a plurality of machine readable patterns.


Patent Information

Application #:
Filing Date: 13 December 2017
Publication Number: 08/2018
Publication Type: INA
Invention Field: ELECTRONICS
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2019-03-01
Renewal Date:

Applicants

TRATA E SYSTEMS PRIVATE LIMITED
216-217, Doshi Wadi, LBS Road, Ghatkopar (W), Mumbai - 400086, Maharashtra, India

Inventors

1. PANDYA, Rajesh Dineshchandra
216-217, Doshi Wadi, LBS Road, Ghatkopar (W), Mumbai - 400086, Maharashtra, India

Specification

Claims:

1. A system for digital writing exchange, the system comprising:
a memory; and
a processor coupled to the memory, wherein the processor is configured to execute programmed instructions stored in the memory to:
receive a template comprising a plurality of graphical elements configured to receive an input;
assign an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates of the graphical element; and
superimpose the template on a machine readable pattern selected from a plurality of machine readable patterns.

2. The system of claim 1, further configured to determine geometrical coordinates of the graphical element, wherein the geometrical coordinates are based upon the machine readable pattern.

3. The system of claim 1, wherein the operation is triggered upon receiving the input, from an input device, on the graphical element.

4. The system of claim 1, wherein the plurality of graphical elements comprise labels, soft buttons, text boxes, images, and blank spaces.

5. The system of claim 1, wherein the operation is at least one of playing a video or an audio file, sending emails or SMS, printing, recording voice or video, capturing data or images, and sending data or images to one or more users.

6. The system of claim 1, further configured to generate a user record based upon the input received on the graphical element, wherein the user record comprises strokes of a user, a timestamp, a writing pressure of the user, the graphical element selected by the user for providing the input, and the operation triggered by the user.

7. The system of claim 6, wherein the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected by the user are encrypted and stored separately in one or more databases.

8. The system of claim 6, further configured to:
receive geometrical coordinates from the input device when the input is on the graphical element;
receive at least one of
the user record based upon the input on the graphical element, and
the input of the user on the graphical element for triggering an operation; and
process the geometrical coordinates using a mapping process to display at least one of the operation performed or the user record on a display screen.

9. The system of claim 7, further configured to:
divide the user record into sections, and
provide privileged access, to a user, of one or more sections based on a privilege level assigned to the user;
fetch the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected in the one or more sections from the one or more databases;
decrypt the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected in the one or more sections; and
merge the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected, for the user to view or modify the one or more sections.

10. The system of claim 1, further configured to:
generate a user profile associated with a user capable of providing the input, wherein the user profile comprises user identity data and user biometric data, wherein the user identity data comprises a machine readable pattern ID, a device ID, and a privileged level, and wherein the biometric data comprises writing style, writing pressure, and fingerprints of the user;
identify the user from a set of users based on the user identity data of the user;
validate the user based upon the biometric data of the user; and
authorize the user to generate one or more user records based on the privileged level associated with the user.

11. The system of claim 1, wherein the operation is selected from a set of predefined operations or a set of user defined operations, wherein the set of user defined operations are generated using a Software Development Kit (SDK) associated with the input device.

12. The system of claim 1, further configured to:
process the input to identify a commencement time of the operation; and
trigger a notification to one or more users based on comparison of the commencement time with a predefined threshold.

13. The system of claim 1, further configured to:
detect new inputs on one or more graphical elements from one or more users; and
trigger a notification based on a privileged level associated with the one or more users.

14. The system of claim 1, further configured to:
process the inputs to identify strokes, timestamp associated with each stroke, and audio captured at each stroke, and
merge the strokes and audio captured at each stroke based on the timestamp associated with each stroke to generate a multimedia file.

15. A method for digital writing exchange, the method comprising steps of:
receiving, by a processor, a template comprising a plurality of graphical elements configured to receive an input;
assigning, by the processor, an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates of the graphical element; and
superimposing, by the processor, the template on a machine readable pattern selected from a plurality of machine readable patterns.

16. The method of claim 15, further comprising steps for determining geometrical coordinates of the graphical element, wherein the geometrical coordinates are based upon the machine readable pattern.

17. The method of claim 15, wherein the operation is triggered upon receiving the input, from an input device, on the graphical element.

18. The method of claim 15, wherein the plurality of graphical elements comprise labels, soft buttons, text boxes, images, and blank spaces.

19. The method of claim 15, wherein the operation is at least one of playing a video or an audio file, sending emails or SMS, printing, recording voice or video, capturing data or images, and sending data or images to one or more users.

20. The method of claim 15, further comprising steps for generating a user record based upon the input received on the graphical element, wherein the user record comprises strokes of a user, a timestamp, a writing pressure of the user, the graphical element selected by the user for providing the input, and the operation triggered by the user.

21. The method of claim 20, wherein the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected by the user are encrypted and stored separately in one or more databases.

22. The method of claim 20, further comprising steps for:
receiving geometrical coordinates from the input device when the input is on the graphical element;
receiving at least one of
the user record based upon the input on the graphical element, and
the input of the user on the graphical element for triggering an operation; and
processing the geometrical coordinates using a mapping process to display at least one of the operation performed or the user record on a display screen.

23. The method of claim 21, further comprising steps for:
dividing the user record into sections, and
providing privileged access, to a user, of one or more sections based on a privilege level assigned to the user;
fetching the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected in the one or more sections from the one or more databases;
decrypting the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected in the one or more sections; and
merging the strokes of the user, the timestamp, the writing pressure of the user, the graphical element selected, for the user to view or modify the one or more sections.

24. The method of claim 15, further comprising steps for:
generating a user profile associated with a user capable of providing the input, wherein the user profile comprises user identity data and user biometric data, wherein the user identity data comprises a machine readable pattern ID, a device ID, and a privileged level, and wherein the biometric data comprises writing style, writing pressure, and fingerprints of the user;
identifying the user from a set of users based on the user identity data of the user;
validating the user based upon the biometric data of the user; and
authorizing the user to generate one or more user records based on the privileged level associated with the user.

25. The method of claim 15, wherein the operation is selected from a set of predefined operations or a set of user defined operations, wherein the set of user defined operations are generated using a Software Development Kit (SDK) associated with the input device.

26. The method of claim 15, further comprising steps for:
processing the input to identify a commencement time of the operation; and
triggering a notification to one or more users based on comparison of the commencement time with a predefined threshold.

27. The method of claim 15, further comprising steps for:
detecting new inputs on one or more graphical elements from one or more users; and
triggering a notification based on a privileged level associated with the one or more users.

28. The method of claim 15, further comprising steps for:
processing the inputs to identify strokes, timestamp associated with each stroke, and audio captured at each stroke, and
merging the strokes and audio captured at each stroke based on the timestamp associated with each stroke to generate a multimedia file.

29. A computer program product having embodied thereon a computer program for digital writing exchange, the computer program product comprises:
a program code for receiving a template comprising a plurality of graphical elements configured to receive an input;
a program code for assigning an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates of the graphical element; and
a program code for superimposing the template on a machine readable pattern selected from a plurality of machine readable patterns.
Description:

FORM 2

THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENT RULES, 2003

COMPLETE SPECIFICATION
(See Section 10 and Rule 13)

Title of invention:
DIGITAL WRITING EXCHANGE

Applicant:
PANDYA, Rajesh Dineshchandra
An Indian national having address as:
216-217, Doshi Wadi, LBS Road,
Ghatkopar (W), Mumbai - 400086,
Maharashtra, India

The following specification describes the invention and the manner in which it is to be performed.
CROSS-REFERENCE TO RELATED APPLICATIONS AND PRIORITY
[001] The present application does not claim priority from any patent application.

TECHNICAL FIELD
[002] The present disclosure in general relates to the field of digital writing. More particularly, the present invention relates to a system and method for digital writing exchange.

BACKGROUND
[003] Pen and paper is traditionally used for recording information. Information recorded on paper may be associated with banks, hospitals, offices, schools, colleges, and the like. However, with the growth of the IT sector, digital record generation has increased drastically. The digital records are easier to maintain and update. For generating digital records using a computer based interface, a completely different user interface and input devices such as a keyboard and a mouse are used. Recording information using these input devices is time consuming and tedious for a new user as compared to the traditional pen and paper approach.
[004] To overcome the above problems, the concept of digital writing was introduced. The digital writing may be performed using a digital writing pen and a digital writing paper. The digital writing pen is configured to capture a user’s writing strokes and record information digitally. The user’s writing strokes are captured using a machine readable pattern printed over a digital writing paper. The writing strokes with timestamp information are processed in order to generate digital records. The digital records may be stored at a database or shared with other users.
[005] However, in the digital writing domain, the generation of digital records is largely hardware dependent. If the digital writing pen is changed, the digital writing paper/template also needs to be replaced. Furthermore, only a predefined set of operations/functionalities can be incorporated on the digital writing paper. There are many limitations on defining custom functions/operations that can be triggered using the digital writing paper.

SUMMARY
[006] Before the present systems and methods for digital writing exchange are illustrated, it is to be understood that this application is not limited to the particular systems and methodologies described, as there can be multiple possible embodiments that are not expressly illustrated in the present disclosure. It is also to be understood that the terminology used in the description is for the purpose of describing the particular versions or embodiments only, and is not intended to limit the scope of the present application. This summary is provided to introduce concepts related to systems and methods for generating custom digital writing templates. This summary is not intended to identify essential features of the claimed subject matter nor is it intended for use in determining or limiting the scope of the claimed subject matter.
[007] In another implementation, a system for digital writing exchange is illustrated. The system comprises a memory and a processor coupled to the memory. The processor is configured to execute programmed instructions stored in the memory. In one embodiment, the processor may execute programmed instructions stored in the memory to receive a template comprising a plurality of graphical elements. The plurality of graphical elements may be configured to receive an input from an input device. Further, the processor may execute programmed instructions stored in the memory to assign an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates associated with the graphical element. Further, the processor may execute programmed instructions stored in the memory to superimpose the template on a machine readable pattern selected from a plurality of machine readable patterns.
[008] In one implementation, a method for digital writing exchange is illustrated. The method may comprise steps for receiving, by a processor, a template comprising a plurality of graphical elements. The plurality of graphical elements may be configured to receive an input from an input device. The method may further comprise steps for assigning, by the processor, an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates associated with the graphical element. The method may further comprise steps for superimposing, by the processor, the template on a machine readable pattern selected from a plurality of machine readable patterns.
[009] In yet another implementation, a computer program product having embodied thereon a computer program for digital writing exchange is disclosed. The program may comprise a program code for receiving a template comprising a plurality of graphical elements. The plurality of graphical elements may be configured to receive an input from an input device. The program may comprise a program code for assigning an operation to a graphical element from the plurality of graphical elements based upon geometrical coordinates associated with the graphical element. The program may comprise a program code for superimposing the template on a machine readable pattern selected from a plurality of machine readable patterns.
BRIEF DESCRIPTION OF DRAWINGS
[0010] The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to refer to like features and components.
[0011] Figure 1 illustrates a network implementation of a system for digital writing exchange, in accordance with an embodiment of the present subject matter.
[0012] Figure 2 illustrates the system for digital writing exchange, in accordance with an embodiment of the present subject matter.
[0013] Figure 3 illustrates a method for digital writing exchange, in accordance with an embodiment of the present subject matter.
[0014] Figure 4 illustrates a digital writing template 400 for recording medical information of a patient, in accordance with an embodiment of the present subject matter.
[0015] Figure 5 illustrates a digital writing template 500 for recording logistics information in a transportation agency, in accordance with an embodiment of the present subject matter.

DETAILED DESCRIPTION
[0016] Some embodiments of the present disclosure, illustrating all its features, will now be discussed in detail. The words “receiving”, “assigning”, “superimposing”, “processing”, “dividing”, “providing”, “fetching”, “decrypting”, “merging”, “identifying”, “validating”, “authorizing”, “triggering”, “detecting”, and other forms thereof, are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. Although any systems and methods similar or equivalent to those described herein can be used in generating custom templates for digital writing, exemplary systems and methods to generate custom templates for digital writing exchange are now described. The disclosed embodiments of the system and method for generating custom templates for digital writing exchange are merely exemplary of the disclosure, which may be embodied in various forms.
[0017] Various modifications to the embodiment will be readily apparent to those skilled in the art and the generic principles herein may be applied to other embodiments. However, one of ordinary skill in the art will readily recognize that the present disclosure for digital writing exchange is not intended to be limited to the embodiments illustrated, but is to be accorded the widest scope consistent with the principles and features described herein.
[0018] In one embodiment, a system for digital writing exchange is illustrated. The system may be communicatively coupled to a set of partner platforms for enabling different partners to develop software/mobile applications for digital writing. Furthermore, a user may access the system through the partner system/platform and generate one or more templates for digital writing. Initially, the system may be configured to enable a user to log into the system using authentication means such as a login ID and password, biometric authentication, and the like. Once the user logs into the system, the system is configured to receive a template from the user. The template may comprise a plurality of graphical elements. The template may be in the form of a CorelDRAW file, a PDF file, or any other type of design file. In one embodiment, the system may enable a graphical user interface that displays a set of operations to be linked to one or more graphical elements. The set of operations may be displayed in the form of icons. The user may assign an operation to a graphical element by dragging and dropping the icon over the graphical element. In one embodiment, the one or more operations may be indicative of a set of machine executable instructions to be executed when an input device (for example, a digital writing pen) interacts with the graphical element.
[0019] The system may further be configured to superimpose a pattern, selected from a plurality of patterns, over the template to generate a digital writing template. Once the digital writing template is generated, in the next step, the user may print the digital writing template and accordingly use the digital writing template for recording information associated with banks, hospitals, offices, schools, colleges, and the like.
[0020] In one embodiment, the system may be configured to enable a user to develop a software application for assisting the user in digital writing. The software application may assist the user in real-time streaming of user actions over the digital writing template as well as trigger one or more operations associated with the graphical element on the printed digital writing template. Further, the network implementation of the system configured for digital writing exchange is illustrated with respect to Figure 1.
[0021] Referring now to Figure 1, a network implementation 100 of a system 102 for digital writing exchange is disclosed. Although the present subject matter is explained considering that the system 102 is implemented on a server, it may be understood that the system 102 may also be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a notebook, a workstation, a mainframe computer, a server, a network server, and the like. In one implementation, the system 102 may be implemented over a server. Further, the server may be part of a cloud network. The system 102 may be configured to communicate with a partner platform 108. The partner platform 108 may enable a set of user devices 104 to connect to the system 102 directly or through the partner platform 108. In one embodiment, a user may use the partner platform 108 to develop an application 110 as well as generate a digital writing template 114. In another embodiment, the user may directly communicate with the system 102 to develop the application 110 and generate the digital writing template 114. Once the application 110 is developed, the application 110 is deployed over the user device 104. The user may link an input device 112 with the application 110 installed over the user device 104 and start using the input device 112 over the digital writing template 114 for information capturing and user record generation. The user record may be maintained locally at the user device 104 or centrally at the system 102. In one embodiment, the system 102 may enable the user to trigger different operations over the user device 104 using the input device 112 and the digital writing template 114.
[0022] Further, it will be understood that the system 102 may be accessed by multiple users through one or more user devices 104-1, 104-2…104-N, collectively referred to as user device 104 hereinafter, or application 110 residing on the user device 104. Examples of the user device 104 may include, but are not limited to, a portable computer, a personal digital assistant, a handheld device, and a workstation. In one embodiment, the user device 104 may be communicatively coupled to the system 102 through a network 106.
[0023] In one implementation, the network 106 may be a wireless network, a wired network or a combination thereof. The network 106 may be implemented as one of the different types of networks, such as intranet, local area network (LAN), wide area network (WAN), the internet, and the like. The network 106 may either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 106 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like. Further, the process of generating the digital writing template 114 using the system 102 is elaborated with respect to Figure 2.
[0024] Referring now to figure 2, the system 102 for digital writing exchange is illustrated in accordance with an embodiment of the present subject matter. In one embodiment, the system 102 may include at least one processor 202, an input/output (I/O) interface 204, and a memory 206. The at least one processor 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, at least one processor 202 may be configured to fetch and execute computer-readable instructions stored in the memory 206.
[0025] The I/O interface 204 may include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like. The I/O interface 204 may allow the system 102 to interact with the user directly or through the user device 104. Further, the I/O interface 204 may enable the system 102 to communicate with other computing devices, such as web servers and external data servers (not shown). The I/O interface 204 may facilitate multiple communications within a wide variety of networks and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. The I/O interface 204 may include one or more ports for connecting a number of devices to one another or to another server. The I/O interface 204 may enable the user to access the system 102 through the partner platform 108 or directly using the user device 104.
[0026] The memory 206 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes. The memory 206 may include modules 208 and data 210.
[0027] The modules 208 may include routines, programs, objects, components, data structures, and the like, which perform particular tasks, functions or implement particular abstract data types. In one implementation, the modules 208 may include a template capturing module 212, a template calibration module 214, a template generation module 216, a template consumption module 218 and other modules 220. The other modules 220 may include programs or coded instructions that supplement applications and functions of the system 102.
[0028] The database 210, amongst other things, serves as a repository for storing data processed, received, and generated by one or more of the modules 208. The database 210 may be configured to maintain template data 228, and other data 230. In one embodiment, the other data 230 may include data generated as a result of the execution of one or more modules in the other modules 220.
[0029] In one implementation, a user may access the system 102 via the I/O interface 204. The user may be registered using the I/O interface 204 in order to use the system 102. In one aspect, the user may access the I/O interface 204 of the system 102 for obtaining information, providing inputs or configuring the system 102. In one embodiment, the system 102 may be accessed from the partner platform 108 or the user device 104. The function of each module in the system 102 is explained as below.
TEMPLATE CAPTURING MODULE 212
[0030] In one embodiment, the template capturing module 212 is configured for receiving a template. The template may be received from the partner platform 108. The template may comprise a plurality of graphical elements. The plurality of graphical elements may comprise labels, soft buttons, text boxes, images, blank spaces and the like. The template may be in the form of a CorelDRAW file, a PDF file, or any other type of design file. In one embodiment, the template may be designed using a template designing tool and then uploaded to the system 102.
TEMPLATE CALIBRATION MODULE 214
[0031] In one embodiment, the template calibration module 214 may enable a graphical user interface for assigning one or more operations to the template. The graphical user interface may display a plurality of operations to be linked with a graphical element from the plurality of graphical elements in the template. The plurality of operations may be displayed in the form of pick and place icons. The plurality of operations may comprise playing a video or an audio file, sending emails or SMS (Short Message Service), printing a file, recording voice or video, capturing data or images, and sending data or images to one or more users. The user may assign an operation to the graphical element by dragging and dropping the icon over the graphical element. As soon as the icon is dropped over a graphical element, the template calibration module 214 is configured to identify the geometrical coordinates associated with the graphical element. For example, if the graphical element is a rectangular text box, then the template calibration module 214 is configured to identify the geometrical coordinates of each point of the sides of the rectangular text box and record this information as the geometrical coordinates of the rectangular text box. In one embodiment, the geometrical coordinates represent the shape, size, and location of the graphical element in the template.
[0032] In one embodiment, the template calibration module 214 may be configured to determine geometrical coordinates of the graphical element based upon a machine readable pattern. For determining the coordinates using the machine readable pattern, the template calibration module 214 may superimpose the machine readable pattern on the template and identify a sub-pattern/subsection of the machine readable pattern that is superimposed over the graphical element. The sub-pattern may be saved in the form of geometrical coordinates associated with the graphical element. Furthermore, the template calibration module 214 may be configured to maintain a table storing the geometrical coordinates and the operation assigned to the graphical element. The table may be stored in the form of template data 228 in the database 210. In a similar manner, the user may assign one or more operations to other graphical elements of the template and the template data 228 may be updated accordingly. In one embodiment, the machine readable pattern used for identification of the geometrical coordinates may further be used for generating the digital writing template 114.
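By way of illustration only, the following Python sketch shows one possible way the calibration table described above could be represented, pairing the geometrical coordinates of a graphical element (simplified here to a bounding box) with the operation dropped onto it. The class and method names (GraphicalElement, TemplateData, assign_operation) are hypothetical and are not part of the specification.

# Illustrative sketch only; names are hypothetical, not taken from the specification.
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class GraphicalElement:
    element_id: str
    # Bounding box of the element in template coordinates: (x_min, y_min, x_max, y_max)
    bounds: Tuple[float, float, float, float]

@dataclass
class TemplateData:
    # Table mapping an element to its coordinates and to the operation assigned to it,
    # analogous to the table maintained by the template calibration module 214.
    elements: Dict[str, GraphicalElement] = field(default_factory=dict)
    operations: Dict[str, str] = field(default_factory=dict)

    def assign_operation(self, element: GraphicalElement, operation: str) -> None:
        """Record the element's coordinates and the operation dropped onto it."""
        self.elements[element.element_id] = element
        self.operations[element.element_id] = operation

# Example: a rectangular text box linked to a "play_video" operation.
template = TemplateData()
template.assign_operation(GraphicalElement("name_field", (10.0, 20.0, 110.0, 35.0)), "play_video")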
[0033] In one embodiment, the operation may correspond to a set of machine executable instructions/software code to be executed when the input device 112 interacts with the graphical element. The interaction may be in the form of a single or multi touch by the input device 112, writing over the graphical element, hovering the input device over the graphical element and the like.
[0034] In one embodiment, the template calibration module 214 may be used for assigning an operation of “playing a video over the user device 104” to a graphical element such as a “soft button”. In this case, the operation of playing a video is mapped with the geometrical coordinates of the “soft button”. As soon as the user interacts with the “soft button” using the input device 112, the application 110 associated with the input device 112 is configured to play the video over the user device 104. In a similar manner, operations and geometrical coordinates may be assigned to other graphical elements in the template.
[0035] In one embodiment, the operation is selected from a set of predefined operations or a set of user defined operations. In one embodiment, the set of predefined operations may comprise a standard set of operations such as “printing a file”, “send email”, “download a file”, “open a file”, and the like. In one embodiment, the set of user defined operations may be generated using a Software Development Kit (SDK) associated with the input device 112 and the user device 104. For example, the user may generate a user defined operation such as “recording voice at the user device 104 for an interval of 10 sec”, “displaying a notification on the user device 104”, “capturing images by the user device 104”, or “operating a lighting system in a smart home”. Such user defined operations may be generated by the user using the software development kit (SDK).
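The following sketch, again purely illustrative and assuming a Python environment, shows how a registry of predefined and user defined operations might be built; the decorator based registration is a hypothetical stand-in for whatever the SDK of the input device 112 actually provides.

# Hypothetical registry of operations that can be assigned to graphical elements.
from typing import Callable, Dict

OPERATIONS: Dict[str, Callable[..., None]] = {}

def operation(name: str) -> Callable[[Callable[..., None]], Callable[..., None]]:
    """Register a named operation so it can later be linked to a graphical element."""
    def register(func: Callable[..., None]) -> Callable[..., None]:
        OPERATIONS[name] = func
        return func
    return register

@operation("print_file")          # predefined operation
def print_file(file_path: str) -> None:
    print(f"Sending {file_path} to the printer")

@operation("record_voice_10s")    # user defined operation built with an SDK
def record_voice_10s() -> None:
    print("Recording voice at the user device for 10 seconds")

# Triggering an operation when the input device interacts with the linked element:
OPERATIONS["record_voice_10s"]()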
TEMPLATE GENERATION MODULE 216
[0036] The template generation module 216 may further be configured to superimpose the template on the machine readable pattern selected from a plurality of machine readable patterns to generate the digital writing template 114. The machine readable pattern may be associated with the input device 112 that is selected by the user. For example, if the user selects an Anoto™ digital writing pen as the input device 112, then the template generation module 216 is configured to superimpose the Anoto™ machine readable pattern on the template. Once the digital writing template 114 is generated, in the next step, the user may print the digital writing template 114 and accordingly use the digital writing template 114 for recording information using the Anoto™ digital writing pen. In one embodiment, the machine readable pattern may be selected from a dot pattern, a dash pattern, or any other type of pattern that may be scanned/read using the input device 112.
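A minimal, hypothetical sketch of the pattern selection step is given below: the machine readable pattern associated with the selected input device is looked up and paired with the template to form the digital writing template. The vendor names and identifiers are placeholders only.

# Illustrative sketch; the catalogue and identifiers are assumptions, not actual pattern data.
from dataclasses import dataclass
from typing import Dict

@dataclass
class Pattern:
    pattern_id: str
    kind: str                      # e.g. "dot" or "dash" pattern

@dataclass
class DigitalWritingTemplate:
    template_id: str
    pattern: Pattern

# Hypothetical catalogue mapping a pen vendor to its machine readable pattern.
PATTERNS: Dict[str, Pattern] = {
    "Anoto": Pattern("ANOTO-0001", "dot"),
    "OtherVendor": Pattern("OV-0042", "dash"),
}

def generate_digital_writing_template(template_id: str, device_vendor: str) -> DigitalWritingTemplate:
    """Pair the design template with the pattern associated with the chosen input device."""
    return DigitalWritingTemplate(template_id=template_id, pattern=PATTERNS[device_vendor])

dwt = generate_digital_writing_template("patient_form", "Anoto")
print(dwt.pattern.pattern_id)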
TEMPLATE CONSUMPTION MODULE 218
[0037] In one embodiment, once the digital writing template 114 is generated, the template consumption module 218 may be configured to generate a user profile associated with each user from a set of users associated with the digital writing template 114. The user profile may comprise user identity data and user biometric data. The user identity data may comprise a machine readable pattern ID, a device ID, and a privileged level associated with the user. Further, the biometric data may comprise writing style, writing pressure, and fingerprints of the user. The user identity data may be used for correctly identifying the user operating the input device 112 on the digital writing template 114. For example, the machine readable pattern ID and device ID captured from the input device 112, when the user operates the input device 112 over the digital writing template 114, may be compared with the profile data associated with each user to identify the user of the input device 112 and the digital writing template 114. Once the user is identified, the input received from the input device 112 is analysed and compared with the biometric data of the user to validate whether the user himself is providing input on the digital writing template 114. In one embodiment, the validation is performed by comparing the biometric data associated with the user profile with the biometric data generated based on analysis of the input provided by the user. If the user is identified as an authentic user, then the template consumption module 218 is configured to authorize the user to generate one or more user records based on the privileged level associated with the user and store the user records in the database 210.
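The identify, validate and authorize flow described above may be sketched as follows; the profile fields and the simple writing pressure comparison are illustrative assumptions, not the actual biometric matching used by the system.

# Hypothetical sketch of identify -> validate -> authorize; names are illustrative.
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class UserProfile:
    user_id: str
    pattern_id: str               # machine readable pattern ID
    device_id: str
    privilege_level: int
    avg_writing_pressure: float   # stand-in for the stored biometric data

def identify(profiles: Dict[str, UserProfile], pattern_id: str, device_id: str) -> Optional[UserProfile]:
    """Identify the user from the pattern ID and device ID captured from the input device."""
    for profile in profiles.values():
        if profile.pattern_id == pattern_id and profile.device_id == device_id:
            return profile
    return None

def validate(profile: UserProfile, measured_pressure: float, tolerance: float = 0.15) -> bool:
    """Validate the user by comparing captured biometrics with the stored profile."""
    return abs(profile.avg_writing_pressure - measured_pressure) <= tolerance

def authorize(profile: UserProfile, required_level: int) -> bool:
    """Allow record generation only if the user's privilege level is sufficient."""
    return profile.privilege_level >= required_level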
[0038] In one embodiment, the user record, generated by the template consumption module 218, may comprise one or more of writing strokes of a user, a timestamp, a writing pressure of the user, the graphical element selected by the user for providing the input, and the operations triggered by the user based on interaction between the input device 112 and the graphical elements. In one embodiment, the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected by the user may be encrypted using one or more encryption algorithms available in the art and stored separately in the database 210. Alternately, the encrypted data may be stored separately across more than one database.
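As an illustrative sketch only, assuming the Python cryptography package, the fields of a user record could be encrypted separately before being written to one or more databases, and decrypted and merged again when read back; the field names and JSON serialisation are hypothetical choices.

# Minimal sketch of per-field encryption of a user record (assumes the "cryptography" package).
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

user_record = {
    "strokes": [(12.1, 44.0), (12.4, 44.2)],   # captured stroke coordinates
    "timestamp": "2017-12-13T10:15:00Z",
    "writing_pressure": 0.62,
    "graphical_element": "name_field",
    "operation": "show_history",
}

# Each field is encrypted on its own so the parts can be stored in separate databases.
encrypted_fields = {
    name: cipher.encrypt(json.dumps(value).encode("utf-8"))
    for name, value in user_record.items()
}

# Decrypting and merging the fields back reverses the process.
restored = {
    name: json.loads(cipher.decrypt(token).decode("utf-8"))
    for name, token in encrypted_fields.items()
}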
[0039] In one embodiment, the template consumption module 218 may be configured to capture geometrical coordinates when the user operates the input device 112 on the digital writing template 114. Based on the interaction between the input device 112 and the digital writing template 114, the template consumption module 218 is configured to generate the user record in the form of user strokes on at least one graphical element, and the input of the user on the graphical element for triggering the operation linked with the graphical element. The geometrical coordinates may be processed and compared with the template data 228 using a mapping process to display at least one of the operation performed or the user record generated on a display screen of the user device 104.
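The mapping process may be illustrated with a simple point in bounding box test: given the coordinates reported by the input device 112, the graphical element whose bounds contain the point is looked up in the template data. The function and element names below are hypothetical.

# Illustrative mapping of reported coordinates to a graphical element.
from typing import Dict, Optional, Tuple

def map_to_element(
    point: Tuple[float, float],
    elements: Dict[str, Tuple[float, float, float, float]],
) -> Optional[str]:
    """Return the ID of the element whose bounding box contains the point, if any."""
    x, y = point
    for element_id, (x_min, y_min, x_max, y_max) in elements.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return element_id
    return None

elements = {
    "name_field": (10.0, 20.0, 110.0, 35.0),
    "show_history": (120.0, 20.0, 150.0, 35.0),
}
print(map_to_element((125.0, 25.0), elements))   # -> "show_history"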
[0040] In one embodiment, the input device 112 may be a digital writing pen. The user may use the digital writing pen to write over the digital writing template 114. As soon as the user starts writing on the graphical element (for example, a text box), the template consumption module 218 is configured to trigger the operation associated with the text box. The template consumption module 218 may also be configured to display the writing strokes of the user over the display screen of the user device 104. The writing strokes may be displayed in the application 110 installed on the user device 104. Furthermore, the inputs provided by the user on the digital writing template 114 may be recorded in the form of a user record and stored over the database 210.
[0041] In one embodiment, once the user record is captured, the template consumption module 218 may divide the user record into sections. The template consumption module 218 is further configured to provide privileged access to one or more users for accessing each section based on the privilege level associated with the one or more users. Each section may be accessible to the users of the system 102 based on the privilege level associated with the users. In one embodiment, the application 110 may enable the user to log into the system 102. Once the user logs into the system 102, the template consumption module 218 may determine the privilege level associated with the user. Furthermore, based on the privilege level associated with the user, the template consumption module 218 may allow or deny the user access to the one or more sections of the user record. If the privilege level associated with the user is equal to or higher than the privilege level assigned to a section, then the user may access the inputs recorded in the section.
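A minimal sketch of privilege based access to the sections of a user record is shown below; the numeric privilege levels and section names are assumptions made for illustration.

# Illustrative privilege check; levels and section names are hypothetical.
from typing import Dict, List

SECTION_PRIVILEGE: Dict[str, int] = {
    "personal_details": 2,
    "medication": 1,
    "diagnostics": 1,
}

def accessible_sections(user_privilege: int) -> List[str]:
    """A user may access a section if their privilege level is equal to or higher than
    the level assigned to that section."""
    return [name for name, level in SECTION_PRIVILEGE.items() if user_privilege >= level]

print(accessible_sections(1))   # e.g. a pharmacist -> ["medication", "diagnostics"]
print(accessible_sections(2))   # e.g. the doctor   -> all three sections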
[0042] In one embodiment, when the user is a privileged user of the section, the template consumption module 218 may fetch the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in the sections from the one or more databases at which the user record is maintained. Further, the template consumption module 218 is configured to decrypt the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in that section. Finally, the template consumption module 218 may merge the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in order to display the section using the application 110 and enable the user to view or modify the one or more sections.
[0043] In one embodiment, the template consumption module 218 may process the input to identify a commencement time of the operation. Based on the commencement time of the operation, the template consumption module 218 may compare the commencement time with a predefined threshold and trigger a notification to one or more users associated with the digital writing template 114. In one embodiment, the template consumption module 218 may process the input to identify the commencement time when the input device 112 interacts with a submit button on the digital writing template 114. Based on the commencement time, one or more other users associated with the system 102 may receive a notification that the user has submitted the operation on the digital writing template 114. Based on the submission, the template consumption module 218 is configured to trigger a notification stating that the activity has been completed. In another embodiment, the template consumption module 218 may determine a delay in execution of the operation based on the comparison between the commencement time and the predefined threshold. Furthermore, based on the delay in execution of the operation, a notification may be triggered to other users.
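The commencement time check may be sketched as follows; the notification function, threshold values and user names are illustrative placeholders only.

# Illustrative comparison of a commencement time against a predefined threshold.
from datetime import datetime, timedelta
from typing import Iterable

def notify(users: Iterable[str], message: str) -> None:
    for user in users:
        print(f"notify {user}: {message}")

def check_commencement(commenced_at: datetime, expected_by: datetime,
                       threshold: timedelta, users: Iterable[str]) -> None:
    """Notify users of completion, or of a delay if the threshold is exceeded."""
    delay = commenced_at - expected_by
    if delay > threshold:
        notify(users, f"Operation commenced {delay} late")
    else:
        notify(users, "Operation submitted on time")

check_commencement(
    commenced_at=datetime(2017, 12, 13, 10, 45),
    expected_by=datetime(2017, 12, 13, 10, 30),
    threshold=timedelta(minutes=10),
    users=["assistant", "lab"],
)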
[0044] In one embodiment, the template consumption module 218 may detect new inputs on one or more graphical elements from one or more users of the system 102. Based on the privilege level associated with the one or more users, the template consumption module 218 may trigger a notification to the user of the digital writing template 114.
[0045] In one embodiment, the template consumption module 218 may process the inputs to identify one or more strokes, the timestamp associated with each stroke, and the audio captured at each stroke. Further, the template consumption module 218 may merge the strokes and audio captured at each stroke based on the timestamp associated with each stroke to generate a multimedia file. The multimedia file may be used by other users for training or guidance purposes. Further, the method for digital writing exchange is illustrated with respect to Figure 3.
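A simplified sketch of merging strokes and the audio captured at each stroke into a single timestamp ordered timeline is given below; producing an actual multimedia container file is outside the scope of this illustration, so the "file" here is just a sorted list of events.

# Illustrative merge of strokes and audio clips by timestamp.
from typing import List, Tuple

Stroke = Tuple[float, str]   # (timestamp in seconds, stroke data)
Audio = Tuple[float, str]    # (timestamp in seconds, audio clip reference)

def merge_timeline(strokes: List[Stroke], audio: List[Audio]) -> List[Tuple[float, str, str]]:
    """Interleave stroke and audio events in timestamp order."""
    events = [(t, "stroke", data) for t, data in strokes] + [(t, "audio", clip) for t, clip in audio]
    return sorted(events, key=lambda event: event[0])

timeline = merge_timeline(
    strokes=[(0.0, "stroke-1"), (2.5, "stroke-2")],
    audio=[(0.0, "clip-1.wav"), (2.5, "clip-2.wav")],
)
for timestamp, kind, payload in timeline:
    print(timestamp, kind, payload)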
[0046] Referring now to figure 3, a method 300 for digital writing exchange, is disclosed in accordance with an embodiment of the present subject matter. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, functions, and the like, that perform particular functions or implement particular abstract data types. The method 300 may also be practiced in a distributed computing environment where functions are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0047] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method 300 or alternate methods. Additionally, individual blocks may be deleted from the method 300 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 300 can be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method 300 may be considered to be implemented in the above described system 102.
[0048] At block 302, the template capturing module 212 is configured for receiving a template. The template may be received from the partner platform 108. The template may comprise a plurality of graphical elements. The plurality of graphical elements may comprise labels, soft buttons, text boxes, images, blank spaces and the like. The template may be in the form of a CorelDRAW file, a PDF file, or any other type of design file. In one embodiment, the template may be designed using a template designing tool and then uploaded to the system 102.
[0049] At block 304, the template calibration module 214 may enable a graphical user interface for assigning one or more operations to the template. The graphical user interface may display a plurality of operations to be linked with a graphical element from the plurality of graphical elements in the template. The plurality of operations may be displayed in the form of pick and place icons. The plurality of operations may comprise playing a video or an audio file, sending emails or SMS (Short Message Service), printing a file, recording voice or video, capturing data or images, and sending data or images to one or more users. The user may assign an operation to the graphical element by dragging and dropping the icon over the graphical element. As soon as the icon is dropped over a graphical element, the template calibration module 214 is configured to identify the geometrical coordinates associated with the graphical element. For example, if the graphical element is a rectangular text box, then the template calibration module 214 is configured to identify the geometrical coordinates of each point of the sides of the rectangular text box and record this information as the geometrical coordinates of the rectangular text box. In one embodiment, the geometrical coordinates represent the shape, size, and location of the graphical element in the template.
[0050] In one embodiment, the template calibration module 214 may be configured to determine geometrical coordinates of the graphical element based upon a machine readable pattern. For determining the coordinates using the machine readable pattern, the template calibration module 214 may superimpose the machine readable pattern on the template and identify a sub-pattern/subsection of the machine readable pattern that is superimposed over the graphical element. The sub-pattern may be saved in the form of geometrical coordinates associated with the graphical element. Furthermore, the template calibration module 214 may be configured to maintain a table storing the geometrical coordinates and the operation assigned to the graphical element. The table may be stored in the form of template data 228 in the database 210. In a similar manner, the user may assign one or more operations to other graphical elements of the template and the template data 228 may be updated accordingly. In one embodiment, the machine readable pattern used for identification of the geometrical coordinates may further be used for generating the digital writing template 114.
[0051] In one embodiment, the operation may correspond to a set of machine executable instructions/software code to be executed when the input device 112 interacts with the graphical element. The interaction may be in the form of a single or multi touch by the input device 112, writing over the graphical element, hovering the input device over the graphical element and the like.
[0052] In one embodiment, the template calibration module 214 may be used for assigning an operation of “playing a video over the user device 104” to a graphical element such as a “soft button”. In this case, the operation of playing a video is mapped with the geometrical coordinates of the “soft button”. As soon as the user interacts with the “soft button” using the input device 112, the application 110 associated with the input device 112 is configured to play the video over the user device 104. In a similar manner, operations and geometrical coordinates may be assigned to other graphical elements in the template.
[0053] In one embodiment, the operation is selected from a set of predefined operations or a set of user defined operations. In one embodiment, the set of predefined operations may comprise a standard set of operations such as “printing a file”, “send email”, “download a file”, “open a file”, and the like. In one embodiment, the set of user defined operations may be generated using a Software Development Kit (SDK) associated with the input device 112 and the user device 104. For example, the user may generate a user defined operation such as “recording voice at the user device 104 for an interval of 10 sec”, “displaying a notification on the user device 104”, “capturing images by the user device 104”, or “operating a lighting system in a smart home”. Such user defined operations may be generated by the user using the software development kit (SDK).
[0054] At block 306, the template generation module 216 may further be configured to superimpose the template on the machine readable pattern selected from a plurality of machine readable patterns to generate the digital writing template 114. The machine readable pattern may be associated with the input device 112 that is selected by the user. For example, if the user selects an Anoto™ digital writing pen as the input device 112, then the template generation module 216 is configured to superimpose the Anoto™ machine readable pattern on the template. Once the digital writing template 114 is generated, in the next step, the user may print the digital writing template 114 and accordingly use the digital writing template 114 for recording information using the Anoto™ digital writing pen. In one embodiment, the machine readable pattern may be in the form of a dot pattern, a dash pattern, or any other type of pattern that may be scanned/read using the input device 112. The process of using the digital writing template for user record generation is further elaborated with respect to Figure 4.
[0055] Referring now to Figure 4, a digital writing template 400 assigned to a doctor for medical record generation is illustrated. Initially, the doctor may submit a template to the partner platform 108. The partner platform 108 may further communicate with the system 102 for converting the template into the digital writing template 400. In one embodiment, at the time of generating the digital writing template 400, the system 102 may assign operations to one or more graphical elements in the digital writing template 400 based on the inputs received from the doctor. Further, the application 110 for assisting the digital writing process may also be generated at the system 102 using the template and operations specified by the doctor. Once the application 110 is developed, the application 110 may be installed over the user device 104 of the doctor. Further, the digital writing template 400 may be printed and used by the doctor to capture the medical record of a patient.
[0056] In one embodiment, the digital writing template 400 may comprise sections 402, 404, 406 and 408. The section 402 may enable recording personal details of a patient using graphical elements 402a such as name, age, gender, date of visit, mobile number and symptoms identified by the doctor and the input device 112. The input device may be a digital writing pen. Further, the section 402 may also comprise a graphical element in the form of a soft button (for example, show history 402b) which allows the doctor to view the history of the patient over the user device 104. Further, the section 404 may be used for recording treatment/medication details suggested by the doctor to the patient. The section 404 may comprise graphical elements such as checkboxes 404a to record the time and frequency of medication. Further, the section 406 may comprise graphical elements 406a and 406b to specify one or more diagnostic tests to be performed and the lab for performing the diagnostic tests. Section 408 may correspond to a set of soft buttons 408a, 408b, …, 408n for assisting the doctor in recording the information as well as triggering the operations associated with the digital writing template 400.
[0057] In one embodiment, once the digital writing template 400 is generated, the template consumption module 218 may be configured to generate a user profile associated with each user from a set of users of the digital writing template 400. The set of users may comprise the doctor, one or more junior doctors, one or more pharmacists, one or more assistants associated with diagnostic labs and the like. In one embodiment, the user profile of the doctor may comprise user identity data and user biometric data of the doctor. The user identity data may comprise a machine readable pattern ID, a device ID, and a privileged level associated with the doctor. Further, the biometric data may comprise writing style, writing pressure, and fingerprints of the doctor. The user identity data may be used for correctly identifying whether the doctor is operating the input device 112 on the digital writing template 400. For example, the machine readable pattern ID and device ID captured from the input device 112 may be compared with the profile data associated with each user to identify the user operating the input device 112 and the digital writing template 400 currently operated by the user. Once the user is identified as the doctor, the input received from the input device 112 is analysed and compared with the biometric data of the doctor to validate whether the doctor himself is providing input on the digital writing template 400. In one embodiment, the validation is performed by comparing the biometric data associated with the user profile with the biometric data captured from the input device 112. If the user is identified as the doctor based on the biometric comparison, then the template consumption module 218 is configured to authorize the doctor to generate one or more records based on the privileged level associated with the doctor.
[0058] In one embodiment, the record generated by the template consumption module 218 may comprise strokes of the doctor, a timestamp, a writing pressure of the doctor, the graphical element selected by the doctor for providing the input, and the operation triggered by the doctor based on interaction between the input device 112 and the graphical element on the digital writing template 400. In one embodiment, the input received in each section is recorded separately. In one embodiment, the strokes of the doctor, the timestamp, the writing pressure of the doctor, and the graphical element selected by the doctor at each section 402, 404, 406 and 408 may be encrypted using one or more encryption algorithms available in the art and stored separately in the database 210. Alternately, the encrypted data may be stored separately across more than one database.
[0059] In one embodiment, the template consumption module 218 may be configured to capture geometrical coordinates, when the input device 112 interacts with the digital writing template 400, using the machine readable pattern. The geometrical coordinates are compared with the geometrical coordinates associated with each of the graphical elements to determine the graphical element that is currently receiving inputs from the doctor. For example, the doctor may enter the patient’s mobile number on the graphical element 402a using the digital writing pen. The template consumption module 218 is configured to detect the geometrical coordinates when the digital writing pen interacts with the graphical element 402a. Further, the geometrical coordinates are then compared with the template data 228 to identify the graphical element as the mobile number field and accordingly record the mobile number entered by the doctor in a structured manner. In a similar manner, the template consumption module 218 is configured to generate the user record in the form of user strokes on the graphical elements at one or more sections of the digital writing template 400. The record may also comprise the operations triggered due to the interaction between the digital writing template 400 and the digital writing pen.
[0060] Furthermore, the geometrical coordinates detected by the digital writing pen may be processed and compared with the template data 228 using a mapping process to display at least one of the operation triggered by the doctor or the user record on a display screen of the user device 104. For example, the doctor may use the digital writing pen to write over the digital writing template 400. As soon as the doctor starts writing on the graphical element (for example, the name field), the template consumption module 218 is configured to trigger the operation associated with the name field. The template consumption module 218 may also be configured to enable real-time simulation of the inputs by displaying the writing strokes of the doctor over the display screen of the user device 104. The writing strokes may be displayed in the application 110 enabled over the user device 104. Furthermore, the inputs provided by the doctor on the digital writing template 400 may be recorded in the form of a user record and stored over the database 210.
[0061] In one embodiment, once the user record is captured, the template consumption module 218 may divide the user record based on the sections 402, 404, 406 and 408. The template consumption module 218 is further configured to provide privileged access for one or more users to each section based on the privilege level associated with the one or more users. Each section may be accessible to the users associated with the digital writing template 400 based on the privilege level associated with the users. In one embodiment, the application 110 may enable a pharmacist to log into the system 102. Once the pharmacist logs into the system 102, the template consumption module 218 may determine the privilege level associated with the pharmacist. Furthermore, based on the privilege level associated with the pharmacist, the template consumption module 218 may allow the pharmacist to access the section 404. If the privilege level associated with the user is equal to or higher than the privilege level assigned to a particular section, the user may access the part of the record that corresponds to that section.
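A minimal sketch of the section-level privilege check; the privilege levels assigned to each section are illustrative assumptions:

```python
SECTION_PRIVILEGE = {"402": 1, "404": 2, "406": 2, "408": 3}

def accessible_sections(user_privilege_level):
    """Return the sections a user may read, given their privilege level."""
    return [section for section, level in SECTION_PRIVILEGE.items()
            if user_privilege_level >= level]

# A pharmacist with level 2 can read sections 402, 404 and 406 but not 408.
print(accessible_sections(2))   # ['402', '404', '406']
```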
[0062] In one example, the digital writing template 400 may be in the form of a prescription. The doctor may write patient information and a list of medicines on the prescription using the input device 112. At the same time, the template consumption module 218 may be configured to store information associated with the list of medicines and a pattern ID associated with the prescription over the database 210. Further, the application 110 may enable the pharmacist to log into the system 102 and register the pharmacist's digital writing pen with the system 102. Once the pharmacist logs into the system 102, the template consumption module 218 may enable the pharmacist to determine the authenticity of the prescription, shared by a patient, using the digital writing pen of the pharmacist. For example, the pharmacist may touch his digital writing pen at a predefined location on the prescription. The pharmacist's digital writing pen may then detect the pattern ID associated with the prescription. The pattern ID may then be used by the template consumption module 218 in order to fetch the patient information and the list of medicines associated with the prescription. This information may be displayed over a user device of the pharmacist. The pharmacist may then compare the information written on the prescription with the information displayed on the user device in order to validate the prescription. The pharmacist may provide his inputs (prescription honoured, prescription partially honoured, and the like) on the prescription, which are saved in the database 210 for future reference by another pharmacist or the doctor.
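The prescription validation flow could be sketched as follows, under the assumption of a simple keyed store; the store contents, pattern ID value, and function names are illustrative:

```python
PRESCRIPTIONS = {
    "PTRN-7781": {
        "patient": "A. Kumar",
        "medicines": ["Paracetamol 500mg", "Cetirizine 10mg"],
        "status": "issued",
    }
}

def validate_prescription(pattern_id, store=PRESCRIPTIONS):
    """Fetch the stored record for the pattern ID so it can be shown on screen."""
    return store.get(pattern_id)

def record_pharmacist_input(pattern_id, status, store=PRESCRIPTIONS):
    """Save the pharmacist's remark (e.g. 'prescription honoured') for future reference."""
    if pattern_id in store:
        store[pattern_id]["status"] = status

record = validate_prescription("PTRN-7781")      # displayed for comparison
record_pharmacist_input("PTRN-7781", "prescription honoured")
```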
[0063] In one example, for accessing the inputs associated with a particular section, the template consumption module 218 may fetch the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in the section 404 from the one or more databases at which the user record is maintained. Further, the template consumption module 218 is configured to decrypt the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in the section 404. Finally, the template consumption module 218 may merge the strokes of the user, the timestamp, the writing pressure of the user, and the graphical element selected in order to display the section 404 using the application 110.
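Continuing the earlier encryption sketch, reassembling one section might look like the following; fetch_fields and the display call in the comments are hypothetical placeholders:

```python
import json

def load_section(encrypted_fields, cipher):
    """Decrypt each stored field of the section and merge them into one record."""
    return {
        name: json.loads(cipher.decrypt(ciphertext).decode())
        for name, ciphertext in encrypted_fields.items()
    }

# section_404 = load_section(fetch_fields("404"), cipher)   # hypothetical fetch
# application.display(section_404)                          # rendered in the app
```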
[0064] In one embodiment, the template consumption module 218 may process the input received from the doctor on the digital writing template 114 to identify a commencement time of the operation. The operation may be tapping on the send button 408n of the digital writing template 400. The template consumption module 218 may compare the commencement time with a predefined threshold and trigger a notification to one or more users.
[0065] In one embodiment, the template consumption module 218 may process the input to identify the commencement time when the input device 112 interacted with the send button 408n on the digital writing template 400. Based on the commencement time, one or more other users associated with the doctor may be configured to receive a notification stating that "the doctor has clicked on the send button 408n on the digital writing template 400". In one embodiment, when the send button 408n is tapped, the template consumption module 218 is configured to trigger a notification to the doctor's receptionist requesting the receptionist to send in the next patient. In another embodiment, the template consumption module 218 may determine a delay in execution of the operation based on the comparison between the commencement time and the predefined threshold. Furthermore, based on the delay in execution of the operation, a notification may be triggered to other patients regarding the delay in the operation triggered by the doctor. For example, the doctor may usually start checking patients at 7 pm. If the template consumption module 218 detects, based on analysis of the input on the digital writing template 114, that the doctor has started checking patients at 7:30 pm, the template consumption module 218 may send a notification to other patients in the queue that their appointment is delayed by half an hour.
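A minimal sketch of the delay notification logic, assuming the commencement time is the timestamp of the first input on the template; the threshold value and the notify callback are illustrative:

```python
from datetime import datetime, timedelta

def check_delay(commencement_time, usual_start, notify,
                threshold=timedelta(minutes=15)):
    """Notify queued patients if the doctor starts later than the usual time."""
    delay = commencement_time - usual_start
    if delay > threshold:
        minutes = int(delay.total_seconds() // 60)
        notify(f"Appointments are delayed by about {minutes} minutes.")

check_delay(
    datetime(2018, 2, 20, 19, 30),   # first stroke detected at 7:30 pm
    datetime(2018, 2, 20, 19, 0),    # doctor usually starts at 7 pm
    notify=print,
)
```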
[0066] In one embodiment, the template consumption module 218 may detect new inputs on one or more graphical elements from one or more users associated with the digital writing template 400. For example, when a junior doctor tries to write on the digital writing template 400, the template consumption module 218 is configured to identify whether the junior doctor has the privilege to write over the digital writing template 400. If the junior doctor does not have the privilege to write on the template 400, the template consumption module 218 may trigger a notification to the doctor stating that the digital writing template 400 is being used by an unauthorized person.
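The unauthorized-use check could be sketched as follows, reusing the UserProfile of the first sketch; the required write level and the notify_owner callback are assumptions:

```python
def check_write_access(profile, required_level, notify_owner):
    """Alert the template owner when a user without write privilege starts writing."""
    if profile is None or profile.privilege_level < required_level:
        notify_owner("The digital writing template 400 is being used by an "
                     "unauthorized person.")
        return False
    return True
```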
[0067] In one embodiment, the template consumption module 218 may process the input received from the doctor on the digital writing template 400 to identify one or more strokes, a timestamp associated with each stroke, and audio captured at each stroke. Further, the template consumption module 218 may merge the strokes and the audio captured at each stroke, based on the timestamp associated with each stroke, to generate a multimedia file. The multimedia file may be used by other users, such as pharmacists and path lab assistants, for better visualization of the user record.
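A minimal sketch of merging strokes and per-stroke audio by timestamp; writing the result as a JSON manifest is an assumption about the multimedia file format, which the specification does not fix:

```python
import json

def build_multimedia_manifest(strokes, audio_clips):
    """Merge strokes and audio clips on their timestamps into one ordered list."""
    events = (
        [{"t": t, "type": "stroke", "data": s} for t, s in strokes] +
        [{"t": t, "type": "audio", "data": a} for t, a in audio_clips]
    )
    return sorted(events, key=lambda e: e["t"])

manifest = build_multimedia_manifest(
    strokes=[(0.0, (10, 20)), (0.4, (11, 21))],
    audio_clips=[(0.0, "clip_0001.wav")],
)
with open("record_playback.json", "w") as f:
    json.dump(manifest, f)
```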
[0068] Referring now to figure 5, a digital writing template 500 assigned to a transport agency for maintaining goods transportation records is illustrated. Initially, the transport agency may submit a template, designed as per their requirements, to the partner platform 108. The partner platform 108 may further communicate with the system 102 for converting the template into the digital writing template 500 and developing the application 110 for managing the record generation activity. In one embodiment, at the time of generating the digital writing template 500, the system 102 may assign operations to one or more graphical elements in the digital writing template 500 based on the inputs received from the transport agency. Further, the application 110 for assisting the digital writing process may also be generated using the system 102 based on the template and operations specified by the transport agency. Once the application 110 is developed, the application 110 is installed over the user device 104 of a clerk at the transport agency. Further, the digital writing template 500 may be printed and used by the clerk to capture records associated with the transportation of goods.
[0069] In one embodiment, the digital writing template 500 may comprise sections 502, 504, and 506. The section 502 may enable recording information of the consignor and consignee using graphical elements 502a such as consignor name, consignee name, mobile number, GST number, TIN number, PAN number, and the like. Further, the section 504 may be designed to record information corresponding to the goods that are transported. The section 504 may comprise graphical elements such as checkboxes 504a to record tax payment information. As soon as the clerk provides his inputs over the section 504, the template consumption module 218 may send this information to a logistics manager so that the transportation of the goods can be planned accordingly.
[0070] Further, the section 506 may comprise graphical elements 506a and 506b to capture the signatures of the customer and the clerk. The graphical element 506c may correspond to a soft button "send 506c" for sending the information to one or more other users in the transport agency. A customer may access the application 110 to check the status of his goods. Furthermore, the information entered by the clerk on the digital writing template may be recorded in the form of the user record. The user record may be maintained in the database as described in figure 2 and figure 4.
[0071] Although implementations for systems and methods for digital writing exchange have been described, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as examples of implementations for digital writing exchange.

Documents

Application Documents

# Name Date
1 201721044871-STATEMENT OF UNDERTAKING (FORM 3) [13-12-2017(online)].pdf 2017-12-13
2 201721044871-FORM 1 [13-12-2017(online)].pdf 2017-12-13
3 201721044871-FIGURE OF ABSTRACT [13-12-2017(online)].jpg 2017-12-13
4 201721044871-DRAWINGS [13-12-2017(online)].pdf 2017-12-13
5 201721044871-COMPLETE SPECIFICATION [13-12-2017(online)].pdf 2017-12-13
6 201721044871-FORM-26 [29-12-2017(online)].pdf 2017-12-29
7 201721044871-PA [04-01-2018(online)].pdf 2018-01-04
8 201721044871-OTHERS [04-01-2018(online)].pdf 2018-01-04
9 201721044871-FORM28 [04-01-2018(online)].pdf 2018-01-04
10 201721044871-FORM-26 [04-01-2018(online)].pdf 2018-01-04
11 201721044871-FORM FOR STARTUP [04-01-2018(online)].pdf 2018-01-04
12 201721044871-ASSIGNMENT DOCUMENTS [04-01-2018(online)].pdf 2018-01-04
13 201721044871-8(i)-Substitution-Change Of Applicant - Form 6 [04-01-2018(online)].pdf 2018-01-04
14 201721044871-FORM-9 [20-02-2018(online)].pdf 2018-02-20
15 201721044871-FORM 18A [21-02-2018(online)].pdf 2018-02-21
16 ABSTRACT1.jpg 2018-08-11
17 201721044871-ORIGINAL UNDER RULE 6 (1A)-ASSIGNMENT & FORM 26-090118.pdf 2018-08-11
18 201721044871-FER.pdf 2018-08-11
19 201721044871-OTHERS [15-09-2018(online)].pdf 2018-09-15
20 201721044871-FER_SER_REPLY [15-09-2018(online)].pdf 2018-09-15
21 201721044871-DRAWING [15-09-2018(online)].pdf 2018-09-15
22 201721044871-COMPLETE SPECIFICATION [15-09-2018(online)].pdf 2018-09-15
23 201721044871-CLAIMS [15-09-2018(online)].pdf 2018-09-15
24 201721044871-ABSTRACT [15-09-2018(online)].pdf 2018-09-15
25 201721044871-HearingNoticeLetter.pdf 2018-10-12
26 201721044871-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [01-11-2018(online)].pdf 2018-11-01
27 201721044871-ExtendedHearingNoticeLetter_07Dec2018.pdf 2018-11-06
28 201721044871-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [05-12-2018(online)].pdf 2018-12-05
29 201721044871-ExtendedHearingNoticeLetter_11Jan2019.pdf 2018-12-10
30 201721044871-Correspondence to notify the Controller (Mandatory) [02-01-2019(online)].pdf 2019-01-02
31 201721044871-Written submissions and relevant documents (MANDATORY) [14-01-2019(online)].pdf 2019-01-14
32 201721044871-Written submissions and relevant documents (MANDATORY) [15-01-2019(online)].pdf 2019-01-15
33 201721044871-Response to office action (Mandatory) [30-01-2019(online)].pdf 2019-01-30
34 201721044871-RELEVANT DOCUMENTS [30-01-2019(online)].pdf 2019-01-30
35 201721044871-PETITION UNDER RULE 137 [30-01-2019(online)].pdf 2019-01-30
36 201721044871-MARKED COPIES OF AMENDEMENTS [30-01-2019(online)].pdf 2019-01-30
37 201721044871-FORM 13 [30-01-2019(online)].pdf 2019-01-30
38 201721044871-AMMENDED DOCUMENTS [30-01-2019(online)].pdf 2019-01-30
39 201721044871-PatentCertificate01-03-2019.pdf 2019-03-01
40 201721044871-IntimationOfGrant01-03-2019.pdf 2019-03-01
41 201721044871-RELEVANT DOCUMENTS [31-03-2020(online)].pdf 2020-03-31
42 201721044871-RELEVANT DOCUMENTS [21-09-2021(online)].pdf 2021-09-21
43 201721044871-RELEVANT DOCUMENTS [31-08-2022(online)].pdf 2022-08-31
44 201721044871-RELEVANT DOCUMENTS [21-09-2023(online)].pdf 2023-09-21

Search Strategy

1 201721044871_SEARCH_21-03-2018.pdf

ERegister / Renewals

3rd: 22 Mar 2019 (From 13/12/2019 - To 13/12/2020)

4th: 13 Dec 2020 (From 13/12/2020 - To 13/12/2021)

5th: 13 Dec 2020 (From 13/12/2021 - To 13/12/2022)

6th: 13 Dec 2020 (From 13/12/2022 - To 13/12/2023)

7th: 13 Dec 2020 (From 13/12/2023 - To 13/12/2024)

8th: 13 Dec 2020 (From 13/12/2024 - To 13/12/2025)