
System And Method For Templatization Of Contract Documents

Abstract: The present disclosure provides a system and a method for creating a contract template from a given contract document. The method comprises providing a machine learning model trained on instances of entities in a set of contract documents. The method further comprises importing the given contract document, and displaying the given contract document in a document window of a template creation user interface. The method further comprises implementing the machine learning model for extracting entities in the displayed given contract document, defining each one of the extracted entities as one of field types for the displayed given contract document, replacing text corresponding to each one of the extracted entities with a respective space in the displayed given contract document, and linking each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document. (FIG. 4)


Patent Information

Filing Date: 27 February 2023
Publication Number: 35/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

QUOQO TECHNOLOGIES PRIVATE LIMITED
A-307 Brigade Omega, Banashankari VI Stage, Bangalore 560 062

Inventors

1. Chetan Nagendra
C/o Quoqo Technologies (P) Ltd., A-307 Brigade Omega, Banashankari VI Stage, Bangalore 560 062
2. Gurunath Gandikota
C/o Quoqo Technologies (P) Ltd., A-307 Brigade Omega, Banashankari VI Stage, Bangalore 560 062

Specification

Description: SYSTEM AND METHOD FOR TEMPLATIZATION OF CONTRACT DOCUMENTS

FIELD OF THE PRESENT DISCLOSURE
[0001] The present disclosure generally relates to the application of natural language processing to contract document management, and particularly to systems and methods for creating a contract template from a given contract document, specifically for document templatization, form creation, and document generation.

BACKGROUND
[0002] Document creation using standard templates, known as contract templatization, is a well-known and widely used industrial practice. Contract templatization refers to the process of creating a standardized template for a contract that can be used as the basis for creating multiple contracts with similar terms and conditions. It requires standard documents to be converted into templates, which may later be used for generating new contract documents by filling in the relevant information. The goal of contract templatization is to make it easier to create contracts quickly and efficiently, while also ensuring that the contracts are consistent and adhere to the organization's legal and business standards. Contract templatization can save time by allowing organizations to create contracts quickly and efficiently. When a contract template is available, it can be used as the basis for creating multiple contracts with similar terms and conditions. This eliminates the need to start from scratch each time a contract is needed, which can save a significant amount of time. Additionally, using a contract template ensures that the contracts produced by the organization are consistent and adhere to the organization's legal and business standards. This can help to avoid errors or disputes that may arise from inconsistencies in the contracts.
[0003] There are some platforms available commercially which make standard forms available for document creation. However, the templates used on these platforms are prepared manually and converted into forms. In particular, the problem is solved by manually identifying fields in a Word document, generating a machine-identifiable format for identifying the fields, reading the templates and then executing them. This generally entails using a word processing application such as Microsoft Word, and is a highly manual process. In particular, this may involve reviewing the existing contract to understand its terms and conditions, identifying any elements that may need to be customized for specific contracts, and inserting blanks or comments into the template as needed to allow for such customization. This way of creating the contract template using the word processing application can be time-consuming and may require the involvement of multiple people, including legal counsel and business stakeholders. Automating the process of contract templatization can help to streamline and simplify the process, making it more efficient and effective.
[0004] Therefore, in light of the foregoing discussion, there exists a need to overcome problems associated with the traditional contract templatization process, and provide a framework which is an end-to-end solution for contract templatization.

SUMMARY
[0005] In an aspect, a system for creating a contract template from a given contract document is disclosed. The system comprises a machine learning model trained on instances of entities in a set of contract documents. The system further comprises a template creation user interface. The template creation user interface is configured to enable importing of the given contract document; and displaying the given contract document in a document window thereof. The system further comprises a processing unit. The processing unit is configured to: implement the machine learning model for extracting entities in the displayed given contract document; define each one of the extracted entities as one of field types for the displayed given contract document; replace text corresponding to each one of the extracted entities with a respective space in the displayed given contract document; and link each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document.
[0006] In one or more embodiments, the machine learning model is trained based on Named Entity Recognition (NER) technique.
[0007] In one or more embodiments, the machine learning model is trained on the instances of entities in an IOB (inside, outside, beginning) form or using similar tagging techniques.
[0008] In one or more embodiments, the template creation user interface provides a workflow button to enable a user to define a workflow by identifying steps involved in generation of a contract document from the created contract template.
[0009] In one or more embodiments, the template creation user interface further provides one or more of: an insert signature button to allow for inserting box(es) for signature(s) of involved parties in a contract document to be generated from the created contract template; an insert stamp pad button to allow for inserting box(es) for stamp(s) of involved parties in a contract document to be generated from the created contract template; and an add live photograph button to allow for inserting a picture of the user as a photo capture field in a contract document to be generated from the created contract template.
[0010] In one or more embodiments, the template creation user interface further provides a template name box to enable a user to assign a name to the created contract template.
[0011] In one or more embodiments, the system further comprises a form generation user interface configured to implement the created contract template for generating a contract document therefrom.
[0012] In another aspect, a method for creating a contract template from a given contract document is disclosed. The method comprises: providing a machine learning model trained on instances of entities in a set of contract documents; importing the given contract document; displaying the given contract document in a document window of a template creation user interface; implementing the machine learning model for extracting entities in the displayed given contract document; defining each one of the extracted entities as one of field types for the displayed given contract document; replacing text corresponding to each one of the extracted entities with a respective space in the displayed given contract document; and linking each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document.
[0013] In one or more embodiments, the method further comprises training the machine learning model based on Named Entity Recognition (NER) technique.
[0014] In one or more embodiments, the method further comprises training the machine learning model on the instances of entities in an IOB (inside, outside, beginning) form.

BRIEF DESCRIPTION OF THE FIGURES
[0015] For a more complete understanding of example embodiments of the present disclosure, reference is now made to the following descriptions taken in connection with the accompanying drawings in which:
[0016] FIG. 1 illustrates a system that may reside on and may be executed by a computer, which may be connected to a network, in accordance with one or more exemplary embodiments of the present disclosure;
[0017] FIG. 2 illustrates a diagrammatic view of a server, in accordance with one or more exemplary embodiments of the present disclosure;
[0018] FIG. 3 illustrates a diagrammatic view of a user device, in accordance with one or more exemplary embodiments of the present disclosure;
[0019] FIG. 4 illustrates a representative user interface for creating a template from a given contract document, in accordance with one or more exemplary embodiments of the present disclosure;
[0020] FIG. 5 illustrates a schematic of a modelling pipeline for generating a NER (named entity recognition) model for creating the template, in accordance with one or more exemplary embodiments of the present disclosure;
[0021] FIG. 6 illustrates a schematic of an inference pipeline for implementing the generated NER model as per the modelling pipeline of FIG. 5, in accordance with one or more exemplary embodiments of the present disclosure; and
[0022] FIG. 7 illustrates a representative user interface for generating a contract document from a contract template, in accordance with one or more exemplary embodiments of the present disclosure.

DETAILED DESCRIPTION
[0023] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure is not limited to these specific details.
[0024] Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0025] Furthermore, in the following detailed description of the present disclosure, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be understood that the present disclosure may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present disclosure.
[0026] The present disclosure proposes a framework which is an end-to-end solution for the whole contract templatization process. The framework allows users to upload standard documents, identify fields that need to be converted as editable entities, create forms using this information, make the forms available for document creation, allow for collaboration on the platform and finally execute them on the platform using e-signatures or digital signatures.
[0027] Embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of computer-readable storage medium, such as program modules, executed by one or more computers or other devices. By way of example, and not limitation, computer-readable storage media may comprise non-transitory computer-readable storage media and communication media; non-transitory computer-readable media include all computer-readable media except for a transitory, propagating signal. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
[0028] Some portions of the detailed description that follows are presented and discussed in terms of a process or method. Although steps and sequencing thereof are disclosed in figures herein describing the operations of this method, such steps and sequencing are exemplary. Embodiments are well suited to performing various other steps or variations of the steps recited in the flowchart of the figure herein, and in a sequence other than that depicted and described herein. Some portions of the detailed descriptions that follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those utilizing physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as transactions, bits, values, elements, symbols, characters, samples, pixels, or the like.
[0029] In some implementations, any suitable computer usable or computer readable medium (or media) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-usable, or computer-readable, storage medium (including a storage device associated with a computing device) may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fibre, a portable compact disc read-only memory (CD-ROM), an optical storage device, a digital versatile disk (DVD), a static random access memory (SRAM), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, a media such as those supporting the internet or an intranet, or a magnetic storage device. Note that the computer-usable or computer-readable medium could even be a suitable medium upon which the program is stored, scanned, compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of the present disclosure, a computer-usable or computer-readable, storage medium may be any tangible medium that can contain or store a program for use by or in connection with the instruction execution system, apparatus, or device.
[0030] In some implementations, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. In some implementations, such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. In some implementations, the computer readable program code may be transmitted using any appropriate medium, including but not limited to the internet, wireline, optical fibre cable, RF, etc. In some implementations, a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
[0031] In some implementations, computer program code for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like. Java and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates. However, the computer program code for carrying out operations of the present disclosure may also be written in conventional procedural programming languages, such as the "C" programming language, PASCAL, or similar programming languages, as well as in scripting languages such as JavaScript, PERL, or Python. In the present implementations, the language or framework used for training may be one of Python, TensorFlow, Bazel, C, or C++. Further, the decoder in the user device (as will be discussed) may use C, C++ or any processor-specific ISA. Furthermore, assembly code inside C/C++ may be utilized for specific operations. Also, the ASR (automatic speech recognition) and G2P decoder along with the entire user system can be run in embedded Linux (any distribution), Android, iOS, Windows, or the like, without any limitations. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the internet using an Internet Service Provider). In some implementations, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGAs) or other hardware accelerators, micro-controller units (MCUs), or programmable logic arrays (PLAs) may execute the computer readable program instructions/code by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
[0032] In some implementations, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus (systems), methods and computer program products according to various implementations of the present disclosure. Each block in the flowchart and/or block diagrams, and combinations of blocks in the flowchart and/or block diagrams, may represent a module, segment, or portion of code, which comprises one or more executable computer program instructions for implementing the specified logical function(s)/act(s). These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the computer program instructions, which may execute via the processor of the computer or other programmable data processing apparatus, create the ability to implement one or more of the functions/acts specified in the flowchart and/or block diagram block or blocks or combinations thereof. It should be noted that, in some implementations, the functions noted in the block(s) may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
[0033] In some implementations, these computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks or combinations thereof.
[0034] In some implementations, the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed (not necessarily in a particular order) on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts (not necessarily in a particular order) specified in the flowchart and/or block diagram block or blocks or combinations thereof.
[0035] Referring to example implementation of FIG. 1, there is shown a system 100 that may reside on and may be executed by a computer (e.g., computer 12), which may be connected to a network (e.g., network 14) (e.g., the internet or a local area network). Examples of computer 12 may include, but are not limited to, a personal computer(s), a laptop computer(s), mobile computing device(s), a server computer, a series of server computers, a mainframe computer(s), or a computing cloud(s). In some implementations, each of the aforementioned may be generally described as a computing device. In certain implementations, a computing device may be a physical or virtual device. In many implementations, a computing device may be any device capable of performing operations, such as a dedicated processor, a portion of a processor, a virtual processor, a portion of a virtual processor, portion of a virtual device, or a virtual device. In some implementations, a processor may be a physical processor or a virtual processor. In some implementations, a virtual processor may correspond to one or more parts of one or more physical processors. In some implementations, the instructions/logic may be distributed and executed across one or more processors, virtual or physical, to execute the instructions/logic. Computer 12 may execute an operating system, for example, but not limited to, Microsoft Windows; Mac OS X; Red Hat Linux, or a custom operating system. (Microsoft and Windows are registered trademarks of Microsoft Corporation in the United States, other countries or both; Mac and OS X are registered trademarks of Apple Inc. in the United States, other countries or both; Red Hat is a registered trademark of Red Hat Corporation in the United States, other countries or both; and Linux is a registered trademark of Linus Torvalds in the United States, other countries or both).
[0036] In some implementations, the instruction sets and subroutines of system 100, which may be stored on storage device, such as storage device 16, coupled to computer 12, may be executed by one or more processors (not shown) and one or more memory architectures included within computer 12. In some implementations, storage device 16 may include but is not limited to: a hard disk drive; a flash drive, a tape drive; an optical drive; a RAID array (or other array); a random-access memory (RAM); and a read-only memory (ROM).
[0037] In some implementations, network 14 may be connected to one or more secondary networks (e.g., network 18), examples of which may include but are not limited to: a local area network; a wide area network; or an intranet, for example.
[0038] In some implementations, computer 12 may include a data store, such as a database (e.g., relational database, object-oriented database, triplestore database, etc.) and may be located within any suitable memory location, such as storage device 16 coupled to computer 12. In some implementations, data, metadata, information, etc. described throughout the present disclosure may be stored in the data store. In some implementations, computer 12 may utilize any known database management system such as, but not limited to, DB2, in order to provide multi-user access to one or more databases, such as the above noted relational database. In some implementations, the data store may also be a custom database, such as, for example, a flat file database or an XML database. In some implementations, any other form(s) of a data storage structure and/or organization may also be used. In some implementations, system 100 may be a component of the data store, a standalone application that interfaces with the above noted data store and/or an applet / application that is accessed via client applications 22, 24, 26, 28. In some implementations, the above noted data store may be, in whole or in part, distributed in a cloud computing topology. In this way, computer 12 and storage device 16 may refer to multiple devices, which may also be distributed throughout the network.
[0039] In some implementations, computer 12 may execute application 20 for contract templatization (as discussed later in more detail). In some implementations, system 100 and/or application 20 may be accessed via one or more of client applications 22, 24, 26, 28. In some implementations, system 100 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within application 20, a component of application 20, and/or one or more of client applications 22, 24, 26, 28. In some implementations, application 20 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within system 100, a component of system 100, and/or one or more of client applications 22, 24, 26, 28. In some implementations, one or more of client applications 22, 24, 26, 28 may be a standalone application, or may be an applet / application / script / extension that may interact with and/or be executed within and/or be a component of system 100 and/or application 20. Examples of client applications 22, 24, 26, 28 may include, but are not limited to, a standard and/or mobile web browser, an email application (e.g., an email client application), a textual and/or a graphical user interface, a customized web browser, a plugin, an Application Programming Interface (API), or a custom application. The instruction sets and subroutines of client applications 22, 24, 26, 28, which may be stored on storage devices 30, 32, 34, 36, coupled to user devices 38, 40, 42, 44, may be executed by one or more processors and one or more memory architectures incorporated into user devices 38, 40, 42, 44.
[0040] In some implementations, one or more of storage devices 30, 32, 34, 36, may include but are not limited to: hard disk drives; flash drives, tape drives; optical drives; RAID arrays; random access memories (RAM); and read-only memories (ROM). Examples of user devices 38, 40, 42, 44 (and/or computer 12) may include, but are not limited to, a personal computer (e.g., user device 38), a laptop computer (e.g., user device 40), a smart/data-enabled, cellular phone (e.g., user device 42), a notebook computer (e.g., user device 44), a tablet (not shown), a server (not shown), a television (not shown), a smart television (not shown), a media (e.g., video, photo, etc.) capturing device (not shown), and a dedicated network device (not shown). User devices 38, 40, 42, 44 may each execute an operating system, examples of which may include but are not limited to, Android, Apple iOS, Mac OS X; Red Hat Linux, or a custom operating system.
[0041] In some implementations, one or more of client applications 22, 24, 26, 28 may be configured to effectuate some or all of the functionality of system 100 (and vice versa). Accordingly, in some implementations, system 100 may be a purely server-side application, a purely client-side application, or a hybrid server-side / client-side application that is cooperatively executed by one or more of client applications 22, 24, 26, 28 and/or system 100.
[0042] In some implementations, one or more of client applications 22, 24, 26, 28 may be configured to effectuate some or all of the functionality of application 20 (and vice versa). Accordingly, in some implementations, application 20 may be a purely server-side application, a purely client-side application, or a hybrid server-side / client-side application that is cooperatively executed by one or more of client applications 22, 24, 26, 28 and/or application 20. As one or more of client applications 22, 24, 26, 28, system 100, and application 20, taken singly or in any combination, may effectuate some or all of the same functionality, any description of effectuating such functionality via one or more of client applications 22, 24, 26, 28, system 100, application 20, or combination thereof, and any described interaction(s) between one or more of client applications 22, 24, 26, 28, system 100, application 20, or combination thereof to effectuate such functionality, should be taken as an example only and not to limit the scope of the disclosure.
[0043] In some implementations, one or more of users 46, 48, 50, 52 may access computer 12 and system 100 (e.g., using one or more of user devices 38, 40, 42, 44) directly through network 14 or through secondary network 18. Further, computer 12 may be connected to network 14 through secondary network 18, as illustrated with phantom link line 54. System 100 may include one or more user interfaces, such as browsers and textual or graphical user interfaces, through which users 46, 48, 50, 52 may access system 100.
[0044] In some implementations, the various user devices may be directly or indirectly coupled to communication network, such as communication network 14 and communication network 18, hereinafter simply referred to as network 14 and network 18, respectively. For example, user device 38 is shown directly coupled to network 14 via a hardwired network connection. Further, user device 44 is shown directly coupled to network 18 via a hardwired network connection. User device 40 is shown wirelessly coupled to network 14 via wireless communication channel 56 established between user device 40 and wireless access point (i.e., WAP) 58, which is shown directly coupled to network 14. WAP 58 may be, for example, an IEEE 802.11a, 802.11b, 802.11g, Wi-Fi, RFID, and/or Bluetooth (including Bluetooth Low Energy) device that is capable of establishing wireless communication channel 56 between user device 40 and WAP 58. User device 42 is shown wirelessly coupled to network 14 via wireless communication channel 60 established between user device 42 and cellular network / bridge 62, which is shown directly coupled to network 14.
[0045] In some implementations, some or all of the IEEE 802.11x specifications may use Ethernet protocol and carrier sense multiple access with collision avoidance (i.e., CSMA/CA) for path sharing. The various 802.11x specifications may use phase-shift keying (i.e., PSK) modulation or complementary code keying (i.e., CCK) modulation, for example, Bluetooth (including Bluetooth Low Energy) is a telecommunications industry specification that allows, e.g., mobile phones, computers, smart phones, and other electronic devices to be interconnected using a short-range wireless connection. Other forms of interconnection (e.g., Near Field Communication (NFC)) may also be used.
[0046] The system 100 may include a server (such as server 200, as shown in FIG. 2) for the contract templatization process. Herein, FIG. 2 is a block diagram of an example of the server 200 capable of implementing embodiments according to the present disclosure. In one embodiment, an application server as described herein may be implemented on exemplary server 200. In the example of FIG. 2, the server 200 includes a processing unit 205 (hereinafter, referred to as CPU 205) for running software applications (such as, the application 20 of FIG. 1) and optionally an operating system. As illustrated, the server 200 further includes a database 210 (hereinafter, referred to as memory 210) which stores applications and data for use by the CPU 205. Storage 215 provides non-volatile storage for applications and data and may include fixed disk drives, removable disk drives, flash memory devices, and CD-ROM, DVD-ROM or other optical storage devices. An optional user input device 220 includes devices that communicate user inputs from one or more users to the server 200 and may include keyboards, mice, joysticks, touch screens, etc. A communication or network interface 225 is provided which allows the server 200 to communicate with other computer systems via an electronic communications network, including wired and/or wireless communication and including an Intranet or the Internet. In one embodiment, the server 200 receives instructions and user inputs from a remote computer through communication interface 225. Communication interface 225 can comprise a transmitter and receiver for communicating with remote devices. An optional display device 250 may be provided which can be any device capable of displaying visual information in response to a signal from the server 200. The components of the server 200, including the CPU 205, memory 210, data storage 215, user input devices 220, communication interface 225, and the display device 250, may be coupled via one or more data buses 260.
[0047] In the embodiment of FIG. 2, a graphics system 230 may be coupled with the data bus 260 and the components of the server 200. The graphics system 230 may include a physical graphics processing unit (GPU) 235 and graphics memory. The GPU 235 generates pixel data for output images from rendering commands. The physical GPU 235 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel. For example, mass scaling processes for rigid bodies or a variety of constraint solving processes may be run in parallel on the multiple virtual GPUs. Graphics memory may include a display memory 240 (e.g., a framebuffer) used for storing pixel data for each pixel of an output image. In another embodiment, the display memory 240 and/or additional memory 245 may be part of the memory 210 and may be shared with the CPU 205. Alternatively, the display memory 240 and/or additional memory 245 can be one or more separate memories provided for the exclusive use of the graphics system 230. In another embodiment, graphics processing unit 230 includes one or more additional physical GPUs 255, similar to the GPU 235. Each additional GPU 255 may be adapted to operate in parallel with the GPU 235. Each additional GPU 255 generates pixel data for output images from rendering commands. Each additional physical GPU 255 can be configured as multiple virtual GPUs that may be used in parallel (concurrently) by a number of applications or processes executing in parallel, e.g., processes that solve constraints. Each additional GPU 255 can operate in conjunction with the GPU 235, for example, to simultaneously generate pixel data for different portions of an output image, or to simultaneously generate pixel data for different output images. Each additional GPU 255 can be located on the same circuit board as the GPU 235, sharing a connection with the GPU 235 to the data bus 260, or each additional GPU 255 can be located on another circuit board separately coupled with the data bus 260. Each additional GPU 255 can also be integrated into the same module or chip package as the GPU 235. Each additional GPU 255 can have additional memory, similar to the display memory 240 and additional memory 245, or can share the memories 240 and 245 with the GPU 235. It is to be understood that the circuits and/or functionality of GPU as described herein could also be implemented in other types of processors, such as general-purpose or other special-purpose coprocessors, or within a CPU.
[0048] The system 100 may also include a user device 300 (as shown in FIG. 3). In embodiments of the present disclosure, the user device 300 may embody a smartphone, a personal computer, a tablet, or the like. Herein, FIG. 3 is a block diagram of an example of the user device 300 capable of implementing embodiments according to the present disclosure. In the example of FIG. 3, the user device 300 includes a processing unit 305 (hereinafter, referred to as CPU 305) for running software applications (such as, the application 20 of FIG. 1) and optionally an operating system. A user input device 320 is provided which includes devices that communicate user inputs from one or more users and may include keyboards, mice, joysticks, touch screens, and/or microphones. Further, a network interface 325 is provided which allows the user device 300 to communicate with other computer systems (e.g., the server 200 of FIG. 2) via an electronic communications network, including wired and/or wireless communication and including the Internet. The user device 300 may also include a decoder 355, which may be any device capable of decoding (decompressing) data that may be encoded (compressed). A display device 350 may be provided which may be any device capable of displaying visual information, including information received from the decoder 355. In particular, as will be described below, the display device 350 may be used to display visual information received from the server 200 of FIG. 2. The components of the user device 300 may be coupled via one or more data buses 360.
[0049] It may be seen that compared to the server 200 in the example of FIG. 2, the user device 300 in the example of FIG. 3 may have fewer components and less functionality. However, the user device 300 may include other components, for example, in addition to those described above. In general, the user device 300 may be any type of device that has one or more of display capability and the capability to receive inputs from a user and send such inputs to the server 200. However, it may be appreciated that the user device 300 may have additional capabilities beyond those just mentioned.
[0050] The system 100 of the present disclosure provides a framework which is an end-to-end solution for the contract templatization process. The framework allows users to upload or import a given contract document (interchangeably referred to as “contract document,” “standard document,” or simply “document” without any limitations) that needs to be templatized, identify fields that need to be converted as editable entities, create contract template using this information, and make the contract template available for further document generation therefrom. The framework also allows for collaboration on the platform for the template creation process and/or the document generation process. In some examples, the framework further allows for execution of the generated documents on the platform itself, using tools like e-signatures, digital signatures and/or stamps.
[0051] Referring to FIG. 4, illustrated is a representative template creation user interface (as represented by reference numeral 400), as part of the system 100, for creating a template from a given contract document, in accordance with one or more exemplary embodiments of the present disclosure. The template creation user interface 400, as provided, may be implemented in the user device 300 as described. As shown, the template creation user interface 400 provides an upload button 402a. The given contract document that needs to be templatized may be uploaded (and thereby imported) using the upload button 402a. In another example, the system 100 may allow the given contract document to be imported from a storage, such as the storage 215, which may be a company’s internal database, a cloud service, or the like. In an embodiment, the system 100 may also enable editing of an already available contract template document using the proposed framework. For this purpose, the template creation user interface 400 provides an edit template button 402b which may allow a contract template document to be imported (or uploaded), for example, from the storage 215 or the like. It may be appreciated that the contract template document may be imported by performing a search in the storage 215 or by navigating through directories in the storage 215 of the system 100.
[0052] As shown, the template creation user interface 400 also provides a template name box 404. Herein, the template to be generated may be assigned a desired name using the template name box 404. It may be appreciated that the said desired name may be assigned by a user as per his/her requirements. Typically, the assigned name is selected to reflect the essence of the template to be generated that will produce the relevant contract documents when implemented. Further, as shown, the template creation user interface 400 provides a document window 406. Herein, once the given contract document has been uploaded/imported, the system 100 configures the template creation user interface 400 to load the given contract document in the document window 406 for perusal of the user, such as for performing further actions thereon as discussed in the succeeding paragraphs.
[0053] It may be understood that contract documents, even of the same type, may have many fields that vary from one contract to another, like party name, date of commencement, party address, purpose, jurisdiction, term, etc., and these fields thus need to be identified to create a template for the said contract document type. In one embodiment, the fields to be created may be identified in the given contract document by making manual selection(s) in the loaded contract document in the document window 406. In such case, a name of each of the manually created fields is assigned to one of the field spaces, which may be in the form of dropdown menus, in a fields window 408 of the template creation user interface 400. Once the fields are created, the template creation user interface 400 may allow the user to highlight text (terms) corresponding to a selected field in the loaded document in the document window 406. It may be appreciated that the present template creation user interface 400 with the available fields window 408 allows the described manual process of identifying fields, which may ultimately be used for creating the contract document template, to be performed in a user-friendly and intuitive manner, which is not the case with conventionally known contract templatization software and services.
[0054] In a preferred embodiment of the present disclosure, the system 100 includes a machine learning model for automatic identification of the said fields in the given contract document. In other words, the fields for the contract template to be created from the given contract document may be identified automatically by using the machine learning model. In an example, the machine learning model is based on an NER (named entity recognition) technique, and thus hereinafter is also referred to as the NER model without any limitations. Herein, the NER model is used to extract the entities from the uploaded documents. Some of the entities of interest may include: party1, party2 and effective date, which are available in the party clause of a contract; contract type, which may, for example, be available in the heading of the document; and the term/termination clause of the contract document.
[0055] Referring to FIG. 5, illustrated is a schematic of a modelling pipeline (as represented by reference numeral 500) for generating an NER model, as per embodiments of the present disclosure. For purposes of the present disclosure, the system 100 may be configured to allow a user to define multiple entities from the displayed text of the contract document (such as, in the document window 406 of the template creation user interface 400), and subsequently train the machine learning model to determine entities complementary to the defined entities in the given contract document. In an example, separate models may be created to identify each or a group of these entities by training the corresponding NER models. In particular, the NER model is trained by using data with entities in an IOB (short for inside, outside, beginning) form extracted from various documents, or using similar tagging techniques. In the present examples, the machine learning model may be trained using a variety of NER algorithms. Further, the process may be iterated multiple times until a desired level of accuracy is achieved.
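By way of a non-limiting illustration of the IOB form referred to above (the labels PARTY1, PARTY2 and EFFECTIVE_DATE are examples only, not a tag set prescribed by the present disclosure), each token of a party clause may be tagged as the beginning (B) or inside (I) of an entity chunk, or as outside (O) of any entity:

```
between    O
Acme       B-PARTY1
Corp       I-PARTY1
and        O
Beta       B-PARTY2
LLC        I-PARTY2
,          O
effective  O
1          B-EFFECTIVE_DATE
March      I-EFFECTIVE_DATE
2023       I-EFFECTIVE_DATE
```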
[0056] As shown, at step 502, the modelling pipeline 500 involves using a training corpus of various types of contract documents (as may be manually identified). In the present examples, the training corpus dataset may need a large number of documents, typically more than 600 per entity, in order to deal with data imbalance. At step 504, the modelling pipeline 500 involves using an annotation tool, such as Prodigy, Inception, Doccano, Brat, etc., to come up with IOB labels (which are a format for chunks, and are similar to part-of-speech tags but can denote the inside, outside, and beginning of a chunk) for the used training corpus. At step 506, the modelling pipeline 500 involves generating the NER model from the annotated training corpus by implementing rule-based models, lexicon-based models or machine learning techniques including, but not limited to, CRFs (conditional random fields), BiLSTM (bidirectional LSTM), CNN, ELMo (Embeddings from Language Models), Stanford NLP, or other transformer-based models (e.g., BERT, GPT-3), etc., without any limitations.
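The following is a minimal, non-authoritative sketch of step 506 using an off-the-shelf NER library (spaCy) rather than any particular algorithm listed above; the sample sentence, the entity labels (PARTY1, PARTY2, EFFECTIVE_DATE) and the model path "contract_ner_model" are illustrative assumptions only, and a production model would be trained on the large annotated corpus described in step 502:

```python
import random
import spacy
from spacy.training import Example

# One hypothetical annotated sentence; real training would use hundreds of
# annotated clauses per entity, as noted in step 502 above.
TEXT = "This Agreement is made between Acme Corp and Beta LLC on 1 March 2022."
SPANS = [("Acme Corp", "PARTY1"), ("Beta LLC", "PARTY2"), ("1 March 2022", "EFFECTIVE_DATE")]
entities = [(TEXT.index(s), TEXT.index(s) + len(s), label) for s, label in SPANS]
TRAIN_DATA = [(TEXT, {"entities": entities})]

nlp = spacy.blank("en")
ner = nlp.add_pipe("ner")
for _, _, label in entities:
    ner.add_label(label)

optimizer = nlp.initialize()
for _ in range(30):                              # iterate until accuracy is acceptable
    random.shuffle(TRAIN_DATA)
    losses = {}
    for text, annotations in TRAIN_DATA:
        example = Example.from_dict(nlp.make_doc(text), annotations)
        nlp.update([example], sgd=optimizer, losses=losses)

nlp.to_disk("contract_ner_model")                # persisted for the inference pipeline
```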
[0057] According to embodiments of the present disclosure, once the models are trained, the entity extraction is carried out on relevant clauses in the given contract document using an inference pipeline (as discussed in the succeeding paragraphs). Herein, the given contract document, as being processed, is first loaded in the document window 406 of the template creation user interface 400. The system 100 utilizes the processing unit 205 (as described) for performing the processing steps described hereinafter.
[0058] Referring to FIG. 6, illustrated is a schematic of an inference pipeline (as represented by reference numeral 600) for implementing the generated NER model (as described in reference to the modelling pipeline 500 of FIG. 5) for entity identification in the given contract document. Herein, at step 602, the inference pipeline 600 involves extracting clauses from the given contract document. The process of extracting clauses from the given contract document may be achieved by utilizing a clause classification/extraction model (not described herein for brevity of the present disclosure). Further, at step 604, the inference pipeline 600 involves implementing the NER model (as described in reference to FIG. 5) for identifying and thereby extracting entities in the text of the extracted clauses of the given contract document. In the present implementation, the text in the given contract document may be pre-processed as per the requirements of the NER model. Consequently, an output of the inference pipeline 600 is in the form of extracted entity details for the given contract document (as represented by block 606 in FIG. 6). As may be appreciated, the entity details may include information like parties, term, jurisdiction, effective date, type of the contract document, etc.
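A rough, illustrative sketch of the inference pipeline 600 under the same assumptions as the training sketch above is given below; the clause-splitting function is a placeholder for the clause classification/extraction model of step 602, which is not detailed in the present disclosure:

```python
import spacy

nlp = spacy.load("contract_ner_model")           # model saved by the training sketch

def split_into_clauses(document_text: str) -> list[str]:
    # Stand-in for the clause classification/extraction model of step 602;
    # splitting on blank lines is only an illustrative placeholder.
    return [c.strip() for c in document_text.split("\n\n") if c.strip()]

def extract_entities(document_text: str) -> list[dict]:
    # Step 604: run the NER model over each extracted clause and collect the
    # entity details of block 606 (label, matched text, character span).
    details = []
    offset = 0
    for clause in split_into_clauses(document_text):
        clause_start = document_text.index(clause, offset)
        doc = nlp(clause)
        for ent in doc.ents:
            details.append({
                "label": ent.label_,                      # e.g. PARTY1, EFFECTIVE_DATE
                "text": ent.text,
                "start": clause_start + ent.start_char,   # offsets in the full document
                "end": clause_start + ent.end_char,
            })
        offset = clause_start + len(clause)
    return details
```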
[0059] It may now be appreciated that the identified entities, or specifically the text corresponding to each one of the identified entities, in the given contract document may be replaced with spaces in the form of blanks or the like. Subsequently, such spaces may be linked to corresponding field types in the fields window 408 of the template creation user interface 400, to effectively create a contract template for the given contract document. For this purpose, each one of the extracted entities may be defined as a field for the given contract document. As may be understood, each of the identified entities (which may be converted into fields) may have associated properties (field type), like “date” for effective date, “text” for party name, “integer” for amount, etc. Such properties for the various identified entities may be automatically determined, and the corresponding field types may be set as such in the fields window 408 of the template creation user interface 400. Herein, for instance, the properties of the fields (i.e., field types) may be set as different types like integer, float, date, text, etc. Furthermore, properties of the fields may be set as a dropdown (for standard lists like jurisdictions, country names, etc.), calendar (for dates), etc. In some embodiments, the template creation user interface 400 may also be configured to allow the field types (as automatically created) to be manually edited, deleted, or modified, if required.
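A simplified sketch of these two operations, namely assigning a field type to each extracted entity and replacing the corresponding text with a blank space of equal length, is given below; the label-to-field-type mapping is hypothetical and would, in practice, be configurable:

```python
# Hypothetical mapping from entity labels to field types; the actual property
# assignment of the disclosure may differ.
FIELD_TYPES = {
    "PARTY1": "text",
    "PARTY2": "text",
    "EFFECTIVE_DATE": "date",
    "AMOUNT": "integer",
    "JURISDICTION": "dropdown",
}

def blank_out_entities(document_text: str, entities: list[dict]):
    """Replace each entity's text with blank spaces and record a field entry
    that the fields window can be linked to."""
    fields = []
    # Work from the end of the document so earlier character offsets stay valid.
    for ent in sorted(entities, key=lambda e: e["start"], reverse=True):
        blank = " " * (ent["end"] - ent["start"])
        document_text = document_text[:ent["start"]] + blank + document_text[ent["end"]:]
        fields.append({
            "label": ent["label"],
            "field_type": FIELD_TYPES.get(ent["label"], "text"),
            "position": (ent["start"], ent["end"]),
        })
    return document_text, fields
```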
[0060] Referring back to FIG. 4, as shown, the contract template may be provided with one or more boxes 410a, 410b to be included therein. Herein, the one or more boxes 410a, 410b may allow for inserting signatures/stamps of the involved parties in the contract document to be generated from the contract template. Such boxes may be inserted on certain page(s) or all of the pages of the contract template as required. As may be contemplated, the boxes 410a, 410b may be inserted at desired location(s) in the contract template, and the number of the boxes 410a, 410b may depend on the number of parties that may be involved in execution of the contract document as may be generated from the contract template being processed. For this purpose, as illustrated, the template creation user interface 400 may provide an insert signature button 412. The insert signature button 412 may allow for inserting box(es) for signature(s) of the involved parties in the contract document to be generated from the created contract template. Also, as illustrated, the template creation user interface 400 may provide an insert stamp pad button 414. The insert stamp pad button 414 may allow for inserting box(es) for stamp(s) of the involved parties in the contract document to be generated from the created contract template. In some examples, as illustrated, the template creation user interface 400 may also provide an add live photograph button 416. Herein, the add live photograph button 416 may be configured to utilize a camera associated with the user device 300 (as described) to capture a picture of the user and further allow the said picture to be inserted as a photo capture field in the contract document to be generated from the created contract template.
[0061] In some embodiments, the system 100 may configure the template creation user interface 400 to provide a workflow button 418. The workflow button 418 may allow the user to define a workflow by identifying various steps involved in finalization of the contract document to be generated from the contract template including, but not limited to, setting names/emails of people the document flows to, notifications that need to be sent out, etc. In some examples, the contract document can be set up to include webhooks to manage how the contract document is generated from the contract template being processed, and subsequently shared with other applications, like a contract documents datastore of the same enterprise or the like. Such details may be contemplated by a person skilled in the art and thus are not further described herein for brevity of the present disclosure.
[0062] It may be appreciated that in all of the above described processes, the clauses in the contract document being processed can be dragged and dropped, and re-arranged as required in the document window 406 by the user. Further, as illustrated in FIG. 4, the template creation user interface 400 may provide a save template button 420, which in turn may allow the user to save the created contract template (as finalized so far). It may be understood that when the save template button 420 is clicked, the fields identified in the contract document being processed are replaced by an identifiable string in the backend (for example, a party field may be replaced with a string like “#@#Party1#@#” or “{{ Party1 }}”). Now, the created (saved) contract template may be used as a template document for generating other contract documents of the same type, as discussed in the succeeding paragraphs. In some examples, as illustrated in FIG. 4, the template creation user interface 400 may further allow users to download a copy of the created contract document template, for reference and perusal of the user, by providing a download template button 422.
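A minimal sketch of the backend replacement performed on saving, using the “{{ Party1 }}”-style placeholder mentioned above, may look as follows; it assumes the field records produced by the earlier blank_out_entities sketch and is illustrative only:

```python
def save_template(document_text: str, fields: list[dict]) -> str:
    """Replace each linked field position with an identifiable placeholder such
    as '{{ Party1 }}' so the stored template can later be filled mechanically."""
    # Work from the end so earlier recorded positions stay valid while lengths change.
    for field in sorted(fields, key=lambda f: f["position"][0], reverse=True):
        start, end = field["position"]
        placeholder = "{{ " + field["label"].title() + " }}"
        document_text = document_text[:start] + placeholder + document_text[end:]
    return document_text
```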
[0063] Now referring to FIG. 7, illustrated is a representative form generation user interface (as represented by reference numeral 700) for generating a contract document from a contract template, in accordance with one or more exemplary embodiments of the present disclosure. The contract template as implemented herein may be one of the created contract templates, as described in the preceding paragraphs. For this purpose, the fields (as defined) are recognized by an algorithm to create forms. One way to create forms is to use PDF form generators that take the identifiers and create blank boxes (as generally represented by reference numeral 702) as shown in FIG. 7. Herein, the form entities are set with the corresponding properties of the fields (field types). For example, a date field may be set up as a calendar, a standard list may be set up as a dropdown, and the like.
[0064] Herein, the contract document may be generated using the created contract document template by utilizing one of two different approaches. One of the approaches involves providing details related to the various fields in a fields window (herein, represented by reference numeral 704) of the form generation user interface 700. Further, after filling the fields in the fields window 704, the said approach may involve clicking on a fill document button 706 beside the fields window 704 of the form generation user interface 700. By doing so, the corresponding boxes in the form get filled and finalized. The other approach involves directly filling the boxes 702 (corresponding to the said spaces in the created contract template) in a document window (herein, represented by reference numeral 708) of the form generation user interface 700. When this is done, the fields in the fields window 704 get automatically filled and finalized.
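For illustration only, the placeholder substitution underlying either approach may be sketched as below; the regular expression matches the “{{ Field }}” placeholders of the saved template, and the field names shown in the usage note are hypothetical:

```python
import re

def generate_document(template_text: str, values: dict) -> str:
    """Substitute the '{{ Field }}' placeholders with values collected from the
    fields window 704 (or from the boxes 702 in the document window 708)."""
    def substitute(match: re.Match) -> str:
        key = match.group(1).strip()
        return str(values.get(key, match.group(0)))   # leave unknown fields untouched
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template_text)

# Illustrative usage with hypothetical field values:
# generate_document(saved_template, {"Party1": "Acme Corp", "Effective_Date": "1 March 2023"})
```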
[0065] In some examples, as illustrated, the form generation user interface 700 may provide one or more boxes 710a, 710b in the document window 708 which may correspond to the one or more boxes 410a, 410b of the corresponding created template. Herein, the one or more boxes 710a, 710b may allow for inserting signature(s) and/or stamp print(s) of the involved parties in the generated contract document, for example, the parties involved in execution of the contract document being generated. The generated contract document can be signed by using an add signature button 716 of the form generation user interface 700. In an example, the said signature(s) and/or the said stamp print(s) may be inserted manually by uploading corresponding images or by using tools like a stylus or a touchscreen, or be inserted automatically from already uploaded corresponding images. In another example, this may be achieved either by e-signing (i.e., by uploading a signature, or by signing using a touchpad or a mobile phone after scanning a QR code) or by digitally signing the document using digital authentication.
[0066] Also, in some examples, as illustrated, the form generation user interface 700 may provide an add stamp paper button 712 to enable a user to upload and attach a stamp paper. The attached stamp paper may be inserted in the contract document being generated at a relevant place based on predefined criteria or the like. Further, in some examples, as illustrated, the form generation user interface 700 may provide an insert ancillary documents button 714 to enable the user to attach additional supporting documents. The attached additional supporting documents may be inserted in the contract document being generated at a relevant place based on predefined criteria or the like.
[0067] In some examples, the form generation user interface 700 may further allow the user to associate a workflow with the generated contract document by selecting one of the pre-configured workflows from a dropdown menu 720 and then clicking a workflow button 722. The form generation user interface 700 may also provide an option to invite clients to edit the generated contract document collaboratively, using a video conferencing feature accessible via one or more meeting buttons 724. It may also be possible to allow other parties to edit all or only some of the fields (configurable programmatically, as desired).
[0068] The generated contract document may then be sent by clicking a send document button 718 of the form generation user interface 700, such as to a contract repository or a predefined database or the like, for example, for sharing with relevant parties. In some examples, once a document is sent out, the system 100 may also be configured to track a status of the generated contract document on a document archive page or the like. Such details may be contemplated by a person skilled in the art and thus not further described herein for brevity of the present disclosure.
[0069] The present disclosure also relates to a method for creating a contract template from a given contract document. FIG. 8 illustrates a flowchart listing steps involved in a method 800 for creating a contract template from a given contract document, in accordance with one or more exemplary embodiments of the present disclosure. Various embodiments and variants disclosed above, with respect to the aforementioned system 100 as per the first aspect, apply mutatis mutandis to the present method 800. It may be appreciated that, for the given purpose, the described components of the system 100 may be considered interconnected with each other, and that the steps as described below for the method 800 are generally sequential in nature.
[0070] Herein, at step 802, the method 800 comprises providing a machine learning model trained on instances of entities in a set of contract documents. At step 804, the method 800 comprises importing the given contract document. At step 806, the method 800 comprises displaying the given contract document in a document window of a template creation user interface. At step 808, the method 800 comprises implementing the machine learning model for extracting entities in the displayed given contract document. At step 810, the method 800 comprises defining each one of the extracted entities as one of field types for the displayed given contract document. At step 812, the method 800 comprises replacing text corresponding to each one of the extracted entities with a respective space in the displayed given contract document. At step 814, the method 800 comprises linking each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document.
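For illustration only, the following sketch strings steps 802 to 814 together using spaCy's pretrained English NER model as a stand-in for the trained machine learning model; the present disclosure does not prescribe spaCy or any particular library. The entity-label-to-field-type mapping, the field naming scheme, and the placeholder style are assumptions, and the en_core_web_sm model must be installed separately.

```python
# Minimal sketch (illustrative only) of steps 802-814 using spaCy NER as a
# stand-in for the trained machine learning model.
import spacy

# Assumed mapping from NER labels to the disclosure's field types.
LABEL_TO_FIELD_TYPE = {"ORG": "party", "PERSON": "party",
                       "DATE": "date", "MONEY": "amount"}

def create_template(document_text: str):
    nlp = spacy.load("en_core_web_sm")          # step 802: provide the trained model
    doc = nlp(document_text)                    # steps 804-808: import, display, extract entities
    fields, pieces, cursor = [], [], 0
    for i, ent in enumerate(doc.ents):
        field_type = LABEL_TO_FIELD_TYPE.get(ent.label_)
        if field_type is None:
            continue                            # ignore entity types not used as fields
        name = f"{field_type.capitalize()}{i + 1}"
        fields.append({"name": name, "type": field_type})          # step 810: define field types
        pieces.append(document_text[cursor:ent.start_char])
        pieces.append("{{ " + name + " }}")                        # step 812: replace entity text
        cursor = ent.end_char
    pieces.append(document_text[cursor:])
    return "".join(pieces), fields              # step 814: template text with linked field list
```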
[0071] The present system and method enable the users to create a contract template from a given contract document in an intuitive and user-friendly manner. The present system 100 and associated method allow a user to create a standardized template and then customize it for specific contracts by inserting variables that can be automatically filled in with the relevant information. Contract automation platforms use artificial intelligence (AI) and machine learning algorithms to analyse and understand the language used in contracts. The contract templatization techniques, as disclosed, make it easier to create contracts quickly and efficiently, while also ensuring that the contracts are consistent and adhere to the organization's legal and business standards. The framework disclosed may also be utilized by the user for extracting information of their interest. Thereby, the present disclosure provides an intuitive and quick way for users to extract custom information from text data. The teachings of the present disclosure for contract templatization may be implemented in and are applicable to contract lifecycle management, interactive template creation, automatic field identification, and the like.
[0072] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. In particular, the elements shown in FIGS. 4-7 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
[0073] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. While the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the present disclosure. It is therefore intended that the scope of the patent rights be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the patent rights, which is set forth in the following claims.
Claims:WE CLAIM:
1. A system for creating a contract template from a given contract document, the system comprising:
a machine learning model trained on instances of entities in a set of contract documents;
a template creation user interface configured to:
enable importing of the given contract document; and
displaying the given contract document in a document window thereof; and
a processing unit configured to:
implement the machine learning model for extracting entities in the displayed given contract document;
define each one of the extracted entities as one of field types for the displayed given contract document;
replace text corresponding to each one of the extracted entities with a respective space in the displayed given contract document; and
link each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document.

2. The system as claimed in claim 1, wherein the machine learning model is trained based on Named Entity Recognition (NER) technique.

3. The system as claimed in claim 1, wherein the machine learning model is trained on the instances of entities in an IOB (inside, outside, beginning) form.

4. The system as claimed in claim 1, wherein the template creation user interface provides a workflow button to enable a user to define a workflow by identifying steps involved in generation of a contract document from the created contract template.

5. The system as claimed in claim 1, wherein the template creation user interface further provides one or more of:
an insert signature button to allow for inserting box(es) for signature(s) of involved parties in a contract document to be generated from the created contract template;
an insert stamp pad button to allow for inserting box(es) for stamp(s) of involved parties in a contract document to be generated from the created contract template; and
an add live photograph button to allow for inserting a picture of the user as a photo capture field in a contract document to be generated from the created contract template.

6. The system as claimed in claim 1, wherein the template creation user interface further provides a template name box to enable a user to assign a name to the created contract template.

7. The system as claimed in claim 1 further comprising a form generation user interface configured to implement the created contract template for generating a contract document therefrom.

8. A method for creating a contract template from a given contract document, the method comprising:
providing a machine learning model trained on instances of entities in a set of contract documents;
importing the given contract document;
displaying the given contract document in a document window of a template creation user interface;
implementing the machine learning model for extracting entities in the displayed given contract document;
defining each one of the extracted entities as one of field types for the displayed given contract document;
replacing text corresponding to each one of the extracted entities with a respective space in the displayed given contract document; and
linking each one of the spaces to the corresponding one of field types in a fields window of the template creation user interface, to create the contract template for the given contract document.

9. The method as claimed in claim 8 further comprising training the machine learning model based on Named Entity Recognition (NER) technique.

10. The method as claimed in claim 8 further comprising training the machine learning model on the instances of entities in an IOB (inside, outside, beginning) form.

Documents

Application Documents

# Name Date
1 202341013177-FORM FOR STARTUP [27-02-2023(online)].pdf 2023-02-27
2 202341013177-FORM FOR SMALL ENTITY(FORM-28) [27-02-2023(online)].pdf 2023-02-27
3 202341013177-FORM 1 [27-02-2023(online)].pdf 2023-02-27
4 202341013177-FIGURE OF ABSTRACT [27-02-2023(online)].pdf 2023-02-27
5 202341013177-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [27-02-2023(online)].pdf 2023-02-27
6 202341013177-EVIDENCE FOR REGISTRATION UNDER SSI [27-02-2023(online)].pdf 2023-02-27
7 202341013177-DRAWINGS [27-02-2023(online)].pdf 2023-02-27
8 202341013177-DECLARATION OF INVENTORSHIP (FORM 5) [27-02-2023(online)].pdf 2023-02-27
9 202341013177-COMPLETE SPECIFICATION [27-02-2023(online)].pdf 2023-02-27
10 202341013177-Proof of Right [01-06-2023(online)].pdf 2023-06-01
11 202341013177-RELEVANT DOCUMENTS [09-06-2023(online)].pdf 2023-06-09
12 202341013177-PETITION UNDER RULE 137 [09-06-2023(online)].pdf 2023-06-09
13 202341013177-FORM-26 [09-06-2023(online)].pdf 2023-06-09