
Systems And Methods For Generating An Autonomous Robotic Action Framework

Abstract: Systems and methods for generating an autonomous robotic action framework are provided. Traditional systems and methods fail to provide a unified modeling framework for autonomous robots to integrate robotic entities or elements into complex robotic interactions or deployments. The embodiments of the present disclosure generate the autonomous robotic action framework by modelling, by one or more hardware processors, a plurality of robotic knowledge bases by implementing a graph database querying technique; generating, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique; performing, via a runtime execution module, a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique; and implementing, via an adaptation monitoring module, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework.


Patent Information

Application #
Filing Date
29 November 2018
Publication Number
23/2020
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2023-09-12
Renewal Date

Applicants

Tata Consultancy Services Limited
Nirmal Building, 9th Floor, Nariman Point, Mumbai - 400021, Maharashtra, India

Inventors

1. KATTEPUR, Ajay
Tata Consultancy Services Limited, Gopalan Global Axis H Block, 18 & 19 & 20, KIADB Export Promotion Industrial Area, Whitefield, Bangalore - 560066, Karnataka, India

Specification

Claims:

1. A method for generating an autonomous robotic action framework, the method comprising: modelling, by one or more hardware processors, a plurality of robotic knowledge bases by implementing a graph database querying technique, wherein each of the plurality of robotic knowledge bases comprise at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template (401); generating, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique, wherein each of the plurality of robotic action plans comprise a set of decomposed robotic sub-tasks generated via a design time action planning module for executing a complete robotic task (402); performing, via a runtime execution module, a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique (403); and implementing, via an adaptation monitoring module, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework, wherein the autonomous robotic framework comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, and wherein the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique (404).

2. The method as claimed in claim 1, wherein each of the plurality of autonomous robots coordinate and analyze a runtime anomaly and exception detected during the runtime simulation.

3. The method as claimed in claim 1, wherein the step of generating the plurality of robotic action plans is preceded by capturing, based upon one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, and wherein the one or more property graphs correspond to the plurality of robotic knowledge bases.

4. The method as claimed in claim 3, wherein the one or more property graphs are traversed for modelling the plurality of robotic knowledge bases by implementing a Domain Specific Language (DSL) technique.

5. The method as claimed in claim 1, wherein the step of generating the plurality of robotic action plans comprises triggering, based upon the modelling, one or more robotic perceptions via the design time action planning module.

6. The method as claimed in claim 5, wherein the one or more robotic perceptions process a runtime anomaly and exception detected during the runtime simulation via the design time action planning module.

7. The method as claimed in claim 1, wherein the set of decomposed robotic sub-tasks are mapped to at least one of the plurality of robotic knowledge bases for performing the runtime simulation of each of the plurality of robotic action plans.

8. The method as claimed in claim 7, wherein the mapping is performed based upon a representation of each robotic sub-task amongst the set of decomposed robotic sub-tasks, and wherein the representation comprises robotic actions, robotic targets, robotic components and robotic properties.

9. The method as claimed in claim 1, wherein the step of performing the one or more integrity constraints is preceded by monitoring, based upon the runtime simulation and the performance analysis, a runtime deployment of each robot via the adaptation monitoring module.

10.
A system (100) for generating an autonomous robotic action framework, the system (100) comprising: a memory (102) storing instructions; one or more communication interfaces (106); and one or more hardware processors (104) coupled to the memory (102) via the one or more communication interfaces (106), wherein the one or more hardware processors (104) are configured by the instructions to: model a plurality of robotic knowledge bases by implementing a graph database querying technique, wherein each of the plurality of robotic knowledge bases comprise at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template; generate, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique, wherein each of the plurality of robotic action plans comprise a set of decomposed robotic sub-tasks generated via a design time action planning module (202) for executing a complete robotic task; perform, via a runtime execution module (203), a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique; and implement, via an adaptation monitoring module (204), one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework, wherein the autonomous robotic framework comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, and wherein the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique.

11. The system (100) as claimed in claim 10, wherein each of the plurality of autonomous robots coordinate and analyze a runtime anomaly and exception detected during the runtime simulation.

12. The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to generate the plurality of robotic action plans by capturing, based upon one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, and wherein the one or more property graphs correspond to the plurality of robotic knowledge bases.

13. The system (100) as claimed in claim 12, wherein the one or more hardware processors (104) are configured to traverse the one or more property graphs for modelling the plurality of robotic knowledge bases by implementing a Domain Specific Language (DSL) technique.

14. The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to generate the plurality of robotic action plans by triggering, based upon the modelling, one or more robotic perceptions via the design time action planning module (202).

15. The system (100) as claimed in claim 14, wherein the one or more robotic perceptions process a runtime anomaly and exception detected during the runtime simulation via the design time action planning module (202).

16. The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to map the set of decomposed robotic sub-tasks to at least one of the plurality of robotic knowledge bases for performing the runtime simulation of each of the plurality of robotic action plans.

17. The system (100) as claimed in claim 16, wherein the mapping is performed based upon a representation of each robotic sub-task amongst the set of decomposed robotic sub-tasks, and wherein the representation comprises robotic actions, robotic targets, robotic components and robotic properties.

18.
The system (100) as claimed in claim 10, wherein the one or more hardware processors (104) are configured to perform the one or more integrity constraints by monitoring, based upon the runtime simulation and the performance analysis, a runtime deployment of each robot via the adaptation monitoring module (204).

Description:

FORM 2
THE PATENTS ACT, 1970 (39 of 1970) & THE PATENT RULES, 2003
COMPLETE SPECIFICATION (See Section 10 and Rule 13)

Title of invention: SYSTEMS AND METHODS FOR GENERATING AN AUTONOMOUS ROBOTIC ACTION FRAMEWORK

Applicant: Tata Consultancy Services Limited, a company incorporated in India under the Companies Act, 1956, having address: Nirmal Building, 9th Floor, Nariman Point, Mumbai 400021, Maharashtra, India

The following specification particularly describes the invention and the manner in which it is to be performed.

TECHNICAL FIELD

The disclosure herein generally relates to robotics, and, more particularly, to systems and methods for generating an autonomous robotic framework.

BACKGROUND

Multi-robot systems and related technologies are being used widely in the digital and automation era in a variety of applications and industries. For example, for a warehouse procurement system, multi-robot systems are deployed to automate the process of storing and retrieving a wide variety of objects in and out of a warehouse. In a warehouse, goods are systematically stored across a large area and are taken out of storage based upon demand. Movement of goods or objects in and out of the warehouse poses a major challenge in terms of scheduling of tasks (relating to the movement of goods). Copious amounts of demands flowing into the warehouse procurement system in real time necessitate optimum scheduling. With the rapid development of e-commerce and modern logistics, warehousing systems have a tremendous scale of inventory and a wide range of high demand vis-a-vis short picking times, thereby demanding deployment of robotic systems.
A fundamental characteristic required in robotic deployments is the ability of autonomous robotic devices to self-configure under dynamic goal and deployment conditions. Autonomous robots are being increasingly used in industrial deployments to solve problems via intelligent adaptive mechanisms. A central tenet in such deployments is the generation of efficient action plans that may be executed at runtime. For example, commercial robotic deployments have been made in warehouses to improve the throughput of automated tasks. Due to the above requirements, many autonomic computing techniques have been proposed to create self-aware robotic systems that respond to both high level goals as well as external stimuli. However, such traditional systems and techniques fail to provide a unified modeling framework for autonomous robots, which can integrate robotic entities into complex industrial deployments. Further, such traditional systems and techniques providing robotics solutions have been specialized to a particular robotic application, domain, and selected structure. As a result, present robotics architectures are inherently monolithic, lack inter-operability, lack use of mainstream open standards, and end up being costly.

SUMMARY

Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventors in conventional systems.
For example, in one embodiment, a method for generating an autonomous robotic action framework is provided, the method comprising: modelling, by one or more hardware processors, a plurality of robotic knowledge bases by implementing a graph database querying technique, wherein each of the plurality of robotic knowledge bases comprise at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template; generating, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique, wherein each of the plurality of robotic action plans comprise a set of decomposed robotic sub-tasks generated via a design time action planning module for executing a complete robotic task; performing, via a runtime execution module, a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique; implementing, via an adaptation monitoring module, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework, wherein the autonomous robotic framework comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, and wherein the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique; capturing, based upon one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, wherein the one or more property graphs correspond to the plurality of robotic knowledge bases; and monitoring, based upon the runtime simulation and the performance analysis, a runtime deployment of each robot via the adaptation monitoring module.
In another aspect, there is provided a system for generating an autonomous robotic action framework, the system comprising: a memory storing instructions; one or more communication interfaces; and one or more hardware processors coupled to the memory via the one or more communication interfaces, wherein the one or more hardware processors are configured by the instructions to: model a plurality of robotic knowledge bases by implementing a graph database querying technique, wherein each of the plurality of robotic knowledge bases comprise at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template; generate, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique, wherein each of the plurality of robotic action plans comprise a set of decomposed robotic sub-tasks generated via a design time action planning module for executing a complete robotic task; perform, via a runtime execution module, a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique; implement, via an adaptation monitoring module, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework, wherein the autonomous robotic framework comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, and wherein the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique; generate the plurality of robotic action plans by capturing, based upon one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, and wherein the one or more property graphs correspond to the plurality of
robotic knowledge bases; and perform the one or more integrity constraints by monitoring, based upon the runtime simulation and the performance analysis, a runtime deployment of each robot via the adaptation monitoring module.

In yet another aspect, there is provided one or more non-transitory machine readable information storage mediums comprising one or more instructions which, when executed by one or more hardware processors, cause the one or more hardware processors to perform a method for generating an autonomous robotic action framework, the method comprising: modelling, by one or more hardware processors, a plurality of robotic knowledge bases by implementing a graph database querying technique, wherein each of the plurality of robotic knowledge bases comprise at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template; generating, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique, wherein each of the plurality of robotic action plans comprise a set of decomposed robotic sub-tasks generated via a design time action planning module for executing a complete robotic task; performing, via a runtime execution module, a runtime simulation and a performance analysis of each of the plurality of robotic action plans by implementing the graph database querying technique; implementing, via an adaptation monitoring module, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework, wherein the autonomous robotic framework comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, and wherein the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique; capturing, based upon one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, wherein the one or more property graphs correspond to the plurality of robotic knowledge bases; and monitoring, based upon the runtime simulation and the performance analysis, a runtime deployment of each robot via the adaptation monitoring module.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.

FIG. 1 illustrates a block diagram of a system for generating an autonomous robotic action framework, in accordance with some embodiments of the present disclosure.

FIG. 2 is an architectural diagram depicting components and flow of the system for generating the autonomous robotic action framework, in accordance with some embodiments of the present disclosure.

FIG. 3 illustrates an example of an autonomous robot, in accordance with some embodiments of the present disclosure.

FIG. 4 is a flow diagram illustrating the steps involved in the process of generating the autonomous robotic action framework, in accordance with some embodiments of the present disclosure.

FIGS. 5A through 5E graphically illustrate examples of property graphs for modelling a plurality of robotic knowledge bases, in accordance with some embodiments of the present disclosure.

FIG. 6 graphically illustrates an example of a robotic action plan, in accordance with some embodiments of the present disclosure.

FIG. 7 graphically illustrates multiple measurements of latency based upon knowledge base queries while simulating a hyper-connected graph traversal, in accordance with some embodiments of the present disclosure.
DETAILED DESCRIPTION OF EMBODIMENTS

Exemplary embodiments are described with reference to the accompanying drawings. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the spirit and scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope and spirit being indicated by the following claims.

Embodiments of the present disclosure provide systems and methods for generating an autonomous robotic action framework. Robots and related technologies are being used widely in the digital and automation era in a variety of applications and industries. Robots are often used, for example, to perform repetitive manufacturing procedures. Robots have the ability, for example, to precisely, quickly, and repeatedly place, pick, solder, and tighten components. This can enable robots to improve product quality while reducing build time and cost. As a result, robots are well-adapted to perform repetitive procedures that human beings may find less than rewarding. There are many applications, domains, and resulting structures for robots. Examples include unmanned autonomous robotic vehicles in a military domain, surveillance and security robots in a commercial domain, robotic manipulator arms in an industrial domain, medicinal transport robots in a professional service domain, vacuum cleaning robots in a home, and legged entertainment robots for personal use, among many others. The mechanisms, electronics, sensors, actuators, and their interconnections also vary across robots.
Furthermore, the architecture(s) or solution(s) that control the behavior of a robot also vary across robotics applications. Autonomous robots are being increasingly integrated into manufacturing, supply chain and retail industries due to the twin advantages of improved throughput and adaptivity. In order to handle complex industrial tasks, autonomous robots require robust action plans that can self-adapt to runtime changes. Given a set of high level tasks, an autonomous robot must identify appropriate action plans to perform the task. Robots are intended to be autonomous, with adaptation seen for varying pick-up locations, product dimensions and rates of procurement. In order to successfully integrate robotic entities into complex industrial deployments, a unified modeling robotic framework is required. In terms of generating the unified modeling robotic framework, traditional systems and methods suffer from various limitations, such as performance degradation in the case of large knowledge bases and difficulty in maintaining relationships between various robotic elements (for example, robotic capabilities, task templates, etc.). The method disclosed provides for overcoming the limitations faced by the traditional systems and methods. For example, the method disclosed provides for modelling of robotic knowledge bases. Further, the method disclosed provides for self-optimizing robotic task execution to generate the unified modeling robotic framework. Referring now to the drawings, and more particularly to FIG. 1 through 7, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments and these embodiments are described in the context of the following exemplary system and/or method. FIG. 1 illustrates an exemplary block diagram of a system 100 for generating an autonomous robotic action framework, in accordance with an embodiment of the present disclosure.
In an embodiment, the system 100 includes one or more processors 104, communication interface device(s) or input/output (I/O) interface(s) 106, and one or more data storage devices or memory 102 operatively coupled to the one or more processors 104. The one or more processors 104 that are hardware processors can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor(s) is configured to fetch and execute computer-readable instructions stored in the memory 102. In an embodiment, the system 100 can be implemented in a variety of computing systems, such as laptop computers, notebooks, hand-held devices, workstations, mainframe computers, servers, a network cloud and the like. The I/O interface device(s) 106 can include a variety of software and hardware interfaces, for example, a web interface, a graphical user interface, and the like, and can facilitate multiple communications within a wide variety of networks (N/W) and protocol types, including wired networks, for example, LAN, cable, etc., and wireless networks, such as WLAN, cellular, or satellite. In an embodiment, the I/O interface device(s) can include one or more ports for connecting a number of devices to one another or to another server. The system 100, through the I/O interface 106, may be coupled to external data sources. The memory 102 may include any computer-readable medium known in the art including, for example, volatile memory, such as static random access memory (SRAM) and dynamic random access memory (DRAM), and/or non-volatile memory, such as read only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
In an embodiment, the memory 102 can be configured to store any data that is associated with the generation of the autonomous robotic action framework. In an embodiment, the information pertaining to modelling, robotic action plans, runtime simulation and performance analysis, and the generated autonomous robotic action framework etc. is stored in the memory 102. Further, all information (inputs, outputs and so on) pertaining to the generating of the autonomous robotic action framework may also be stored in the database, as history data, for reference purposes. According to an embodiment of the present disclosure, by referring to FIG. 2, the architecture of the system for generating the autonomous robotic action framework may be considered in detail. By referring to FIG. 2 again, it may be noted that the architecture comprises a Knowledge base module 201, a Design time action planning module 202, a Runtime execution module 203, an Adaptation monitoring module 204, and a Database integrity constraint check 205. The knowledge base module 201 facilitates modelling of robotic knowledge bases. The design time action planning module 202 decomposes robotic tasks into decomposed robotic sub-tasks. The runtime execution module 203 facilitates a simulation and an analysis of robotic action plan(s). The adaptation monitoring module 204 monitors runtime deployments of robots to estimate plan completion. The database integrity constraint check 205 comprises integrity rules (or constraints) to implement constraints on runtime simulation(s). FIG. 4, with reference to FIGS. 1 and 2, illustrates an exemplary flow diagram of a method for generating the autonomous robotic action framework. In an embodiment, the system 100 comprises one or more data storage devices of the memory 102 operatively coupled to the one or more hardware processors 104 and is configured to store instructions for execution of steps of the method by the one or more processors 104.
The steps of the method of the present disclosure will now be explained with reference to the components of the system 100 as depicted in FIG. 1 and the flow diagram. In the embodiments of the present disclosure, the hardware processors 104, when configured by the instructions, perform one or more methodologies described herein. Autonomous Robotic Framework - The term ‘autonomous robotic framework’ comprises a framework to generate a set of executable tasks for a plurality of autonomous robots operating in a robotic environment, wherein one or more of the plurality of autonomous robots coordinate and analyze a runtime anomaly and exception detected during runtime simulation(s). As each of the plurality of autonomous robots is intended to be learning world models, knowledge bases are needed to populate information about the world, objects, perceptions and action sequences needed. Any runtime anomalies may be dealt with through further queries and eventual exception handling. The robots are intended to be autonomous, as each robot is adapted for varying pick-up locations, product dimensions and rates of procurement; that is, each robot perceives its environment, makes decisions based on what it perceives and/or has been programmed to recognize, and then actuates a movement or manipulation within that environment. By referring to FIG. 3, an example of a Keller und Knappich Augsburg™ (KUKA) Mobile Robots (KMR) Quantec™ autonomous robot deployed in a warehouse may be referred to.
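The perceive, decide and actuate cycle described above can be sketched as a minimal control loop. This is an illustrative sketch only; the function names and the environment/goal dictionaries below are assumptions made for the example and are not part of the disclosure:

```python
# Minimal sketch of an autonomous robot's perceive -> decide -> actuate
# cycle. All names here are illustrative assumptions.

def perceive(environment):
    """Read the current state of the robot's surroundings."""
    return {"object_at": environment.get("object_location"),
            "obstacles": environment.get("obstacles", [])}

def decide(percept, goal):
    """Choose an action based on the percept and the programmed goal."""
    if percept["object_at"] == goal["pickup_location"]:
        return "pick"
    return "navigate"

def actuate(action):
    """Trigger the movement or manipulation chosen by decide()."""
    return f"executing {action}"

environment = {"object_location": "bin_7", "obstacles": []}
goal = {"pickup_location": "bin_7"}

action = decide(perceive(environment), goal)
result = actuate(action)  # the object is at the pick-up location, so "pick"
```

Adaptation for varying pick-up locations then amounts to re-running the same loop against an updated environment dictionary, rather than re-programming the robot.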
Thus, the ‘autonomous robotic framework’ comprises: Knowledge Bases that efficiently capture relationships between world models, objects, robotic actions and robotic tasks; robotic action plans that may be decomposed from a high level robotic task, involving querying the knowledge base as well as triggering perceptions in case of a knowledge mismatch; robotic and related techniques to reconfigure robotic actions at runtime, when plans cannot be executed due to constraints; and rules for consistent updates to world model(s), which allow multiple robots to coordinate or analyze exceptions during execution. The process of generating the autonomous robotic action framework is discussed infra via steps 401 through 404. According to an embodiment of the present disclosure, at step 401, the one or more hardware processors 104 are configured to model a plurality of robotic knowledge bases by implementing a graph database querying technique. Generally, as the robots are intended to be learning world models for performing complex industrial and other tasks, knowledge bases are needed to populate information about the world, objects, perceptions and action sequences needed. A knowledge base (or a knowledge database) is an integral part of all autonomous/cognitive robotic architectures. The knowledge base is generally queried both at design time for action generation and at runtime for knowledge updates. In an embodiment, each of the plurality of robotic knowledge bases comprises at least one robotic world model, a robotic capability, a robotic task description, a robotic algorithm and a robotic task template. The one or more hardware processors 104 are configured to model each of the plurality of robotic knowledge bases via the knowledge base module 201. Further, each of the plurality of robotic knowledge bases may be modeled by implementing the graph database querying technique.
In an embodiment, the graph database querying technique comprises using a plurality of property graphs, graph databases, the Gremlin graph query language, or any combination thereof, to model the plurality of robotic knowledge bases. The process of modeling the plurality of robotic knowledge bases via the graph database querying technique may now be considered in detail. As mentioned supra, the graph database querying technique comprises using the plurality of property graphs. Each of the plurality of property graphs is an attributed, labelled and directed graph, and may, in general, be considered an alternative to the semantic ontologies and tuple data-stores generally used in implementations such as KnowRob™ and CRAM™. In an embodiment, each of the plurality of robotic knowledge bases comprises the below-mentioned knowledge graphs:

World Models – The world models describe a robotic environment map and a corresponding layout, including robotic object locations;

Object Templates – The object templates describe robotic target objects of interest, comprising shape, size, color and location;

Robotic Capabilities – The robotic capabilities provide the robot models, capabilities, sensors and actuators that are integrated to perform robotic tasks;

Robotic Algorithms – The robotic algorithms provide the navigation, manipulation and task allocation algorithms that may be used with robotic allocations; and

Robotic Task Templates – The robotic task templates define high level task requirements and the corresponding set of outputs to the high level task requirements.

In an embodiment, by referring to FIG. 5A to 5E, an example of one or more property graphs amongst the plurality of property graphs for modeling one or more world models, one or more robotic capabilities, one or more robotic task descriptions, one or more robotic algorithms and one or more robotic task templates may be referred to.
To describe properties between edges of the one or more property graphs, the method disclosed provides for a set of four relations, namely isOfType, hasProperty, requires and produces. In an embodiment, isOfType provides hierarchical sub-class relationships; hasProperty extends property descriptions using key–value pairs; requires provides pre-conditions to extract knowledge from the graph; and produces provides post-condition effects of executing a node. Each of the set of four relations may be queried by the one or more hardware processors 104 to extract information from each of the plurality of knowledge bases. In an embodiment, by referring to FIG. 5A, it may be noted that FIG. 5A illustrates, graphically, the capabilities of a Pick Robot: it has a Robot Model, Capabilities and a Perception; it requires a Target, World Models and Algorithms; and it produces the Pick and Place actions. FIG. 5B provides the robotic algorithms necessary for robotic executions, comprising path planning, image template matching and grasp manipulation algorithms. FIG. 5C provides for explicit definition(s) of each robotic task to be executed; for instance, the place task may require a world model (amongst the world models), a Target Object and a Picker Robot, and produces a Placed Object. In an embodiment, by referring to FIG. 5D, it may be noted that FIG. 5D provides, graphically, an example of a warehouse world model comprising hasProperty Map and Object. In order to extract a property of an object, Location requires a map of the area. Finally, FIG. 5E provides properties of objects in the world model, comprising the Location of the object, Shape and Contour Map. By referring to FIG. 5A through 5E again, it may be noted that the graph database querying technique provides extensibility and reuse of information across multiple autonomous robotic deployments.
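The four relations above can be illustrated with a minimal in-memory property graph. The following is an illustrative Python sketch, not the patent's OrientDB implementation; the class and helper names are hypothetical, and the vertices mirror the warehouse world model fragment of FIG. 5D:

```python
class PropertyGraph:
    """Minimal attributed, labelled, directed graph (illustrative only)."""

    RELATIONS = ("isOfType", "hasProperty", "requires", "produces")

    def __init__(self):
        self.vertices = {}   # vertex name -> {property key: value}
        self.edges = []      # (source, relation, target) triples

    def add_vertex(self, name, **props):
        self.vertices[name] = props

    def add_edge(self, source, relation, target):
        assert relation in self.RELATIONS, "only the four defined relations"
        self.edges.append((source, relation, target))

    def out(self, source, relation):
        """Follow outgoing edges with the given relation label."""
        return [t for (s, r, t) in self.edges if s == source and r == relation]


# A fragment of the warehouse world model of FIG. 5D
g = PropertyGraph()
g.add_vertex("WorldModel")
g.add_vertex("Warehouse")
g.add_vertex("Objects", Properties="ObjectProperties", Location="ObjectLocation")
g.add_vertex("Map", Rack="RackConfig", Layout="WarehouseLayout")
g.add_edge("Warehouse", "isOfType", "WorldModel")
g.add_edge("Warehouse", "hasProperty", "Map")
g.add_edge("Warehouse", "hasProperty", "Objects")
g.add_edge("Objects", "requires", "Map")

print(g.out("Warehouse", "hasProperty"))   # ['Map', 'Objects']
```

Querying `g.out("Objects", "requires")` likewise returns `['Map']`, matching the pre-condition that extracting an object's Location requires the Map.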
According to an embodiment of the present disclosure, technical improvements and advantages of implementing the graph database querying technique for performing the modelling may be considered. Traditional techniques, for example, semantic ontologies, typically store data in tuple data-stores that reduce the expressivity provided in graph representations. Scalability is another hindrance in the representation, update and query of large ontologies. Graph databases are emerging as an appropriate tool to model interconnectivity and topology of relationships among large knowledge data sets. The proposed graph database querying technique, which implements the graph databases, thus provides for keeping all information about an entity in a single node and showing related information by arcs connected to it. Further, queries can refer directly to this graph structure, such as finding shortest paths or determining certain subgraphs, and the graph databases provide efficient storage structures for graphs, thus reducing computational complexity in operations. According to an embodiment of the present disclosure, the step of modelling comprises capturing, based upon the one or more property graphs, at least one relationship between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template, and wherein the one or more property graphs correspond to the plurality of robotic knowledge bases. Considering an example scenario, by referring to FIG. 5A through 5E yet again, relationships between the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template may be referred, wherein the robotic capabilities require robotic world model(s), the task template(s) require world model(s), and the like.
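The shortest-path queries mentioned above operate directly on the stored adjacency structure rather than on joined tuples. A hedged sketch of such a query, as breadth-first search over a hypothetical toy fragment of the knowledge graph (node names are illustrative, not from the disclosure):

```python
from collections import deque

def shortest_path(adjacency, start, goal):
    """Breadth-first shortest path over an adjacency map of the graph."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # goal not reachable

# Toy fragment: each node points directly at related nodes, as in a graph store.
adjacency = {
    "PickRobot": ["WorldModel", "Target"],
    "WorldModel": ["Map"],
    "Target": ["Location"],
    "Location": ["Map"],
}
print(shortest_path(adjacency, "PickRobot", "Map"))  # ['PickRobot', 'WorldModel', 'Map']
```

Because related nodes are reached by following pointers, each expansion step is constant-time per edge, which is the efficiency argument made for graph databases in this paragraph.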
Further, by referring to the algorithm below, it may be noted that the robotic world model, the robotic capability, the robotic task description, the robotic algorithm and the robotic task template interact with each other. In an embodiment, for implementing one or more of the plurality of property graphs, the method disclosed implements the multi-model OrientDB database. The OrientDB uses a generic vertex persistence class V and a class for edges E. Unlike ontologies that store data using triple stores, the graph databases maintain a graphical structure with vertices and edges. In a graph data model, nodes are physically connected to each other via pointers, thus enabling complex queries to be executed faster and more effectively than in a relational data model. Properties are represented as key–value pairs that may be queried. In an example implementation, a graph database of the world model in FIG. 5D with vertices, edges and properties in the OrientDB may be represented as below:
Gremlin> g.V.map
=> {Name=WorldModel}
=> {Name=Objects, Properties=ObjectProperties, Location=ObjectLocation}
=> {Name=Warehouse}
=> {Name=Map, Rack=RackConfig, Layout=WarehouseLayout, Aisles=AislesConfig}
Gremlin> g.E
=> e[#26:0][#10:0-isOfType->#9:0]
=> e[#96:0][#10:0-hasProperty->#11:0]
=> e[#30:0][#10:0-hasProperty->#9:1]
=> e[#33:0][#9:1-requires->#11:0]
In an embodiment, each of the plurality of property graphs may be traversed and manipulated for modelling the plurality of robotic knowledge bases by implementing a Domain Specific Language (DSL) technique. The method disclosed facilitates performing the manipulation and the traversing via the below set of queries:
Manipulation:
Filtering – The filtering query filters vertices or edges of a property graph based upon defined property labels. Considering an example scenario, the query g.v().Name may be executed to filter properties such as the Name of a vertex; and
Complex Queries – Queries may combine multiple vertices, edges and properties.
Queries may also provide range or equality constraints on numeric property values. For instance, the complex query g.V.has(‘Name’,’Warehouse’).out(‘hasProperty’).map matches the vertex with property key–value pair (Name, Warehouse), follows the outgoing edge labelled hasProperty and produces the resulting vertices as output.
Traversing: Considering an example scenario for traversing, the query g.v().outE.inV.name.path traverses the output edges (outE) of a vertex, and generates the path traversed.
According to an embodiment of the present disclosure, at step 402, the one or more hardware processors 104 are configured to generate, based upon the modelling, a plurality of robotic action plans by implementing a knowledge base querying technique. Each of the plurality of robotic action plans comprises a set of decomposed robotic sub-tasks generated via the design time action planning module 202 for executing a complete robotic task. Thus, the design time action planning module 202 decomposes one or more robotic tasks into the set of decomposed robotic sub-tasks, and applies the knowledge base querying technique to execute the complete task. The process of generating the plurality of robotic action plans by implementing the knowledge base querying technique may now be considered in detail. In an embodiment, for generating or implementing robotic action plans, the knowledge base querying technique implements a formal specification programming language, Orc. As is known in the art, the Orc concurrent programming language is grounded on a formal process calculus to specify complex distributed computing patterns. The execution of programs in Orc makes use of expressions, with the atomic abstraction being a site.
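The filtering and traversal queries above can be mimicked outside a graph database. The following hedged Python sketch reproduces the behaviour of g.V.has(...).out(...).map over a tiny list-of-dicts vertex store; it is not the Gremlin or OrientDB API, and the data is illustrative:

```python
# Toy vertex and edge stores (hypothetical fragment, not the patent's DB)
vertices = [
    {"Name": "Warehouse"},
    {"Name": "WorldModel"},
    {"Name": "Map", "Rack": "RackConfig"},
]
edges = [("Warehouse", "hasProperty", "Map"),
         ("Warehouse", "isOfType", "WorldModel")]

def has(key, value):
    """Filtering step, analogous to g.V.has('Name', 'Warehouse')."""
    return [v for v in vertices if v.get(key) == value]

def out(matched, label):
    """Traversal step, analogous to .out('hasProperty'): follow labelled edges."""
    names = {v["Name"] for v in matched}
    targets = {t for (s, r, t) in edges if s in names and r == label}
    return [v for v in vertices if v["Name"] in targets]

# Analogue of g.V.has('Name','Warehouse').out('hasProperty').map
print(out(has("Name", "Warehouse"), "hasProperty"))
# [{'Name': 'Map', 'Rack': 'RackConfig'}]
```

The composition `out(has(...), ...)` is the essential shape of the fluent Gremlin pipeline: each step consumes the previous step's vertex set.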
To create complex expressions based on site invocations, Orc employs the following combinators:
Parallel Combinator (|) – Given two sites s_1 and s_2, the expression s_1 | s_2 invokes both sites s_1 and s_2 in parallel;
Sequential Combinator (>x>, >>) – In the expression s_1 >x> s_2 (that is, shorthand s_1 >> s_2), site s_1 is evaluated initially, with every value published by s_1 initiating a separate execution of site s_2; and
Pruning Combinator (<x<) – In the expression s_1 <x< s_2, both sites are evaluated in parallel; the first value published by s_2 is bound to x and the remaining execution of s_2 is terminated.
The plurality of robotic knowledge bases may then be queried via Gremlin query sites wrapped in Orc, as shown below:
-- Knowledge base append
def append_model(v) = World_model? >m> append([v], m) >q> World_model := q
-- Gremlin Query site
def class gremlin() =
  def find(v, D) = g.V.has(v, D).map >v> append_model(v)
  def hasProperty(v, D) = g.V.has(v, D).outE('hasProperty').inV.map >v> append_model(v)
  def requires(v, D) = g.V.has(v, D).outE('requires').inV.map >v> append_model(v)
  def isOfType(v, D) = g.V.has(v, D).outE('isOfType').inV.map >v> append_model(v)
  def produces(v, D) = g.V.has(v, D).outE('produces').inV.map >v> append_model(v)
  stop
-- Searching dependencies for "rack" value
val gremlin = gremlin()
gremlin.find("Name", "rack") | gremlin.hasProperty("Name", "Objects") | gremlin.requires("Name", "Map") | gremlin.isOfType("Name", "WorldModel") >> World_model?
In an embodiment, each of the plurality of robotic action plans may be executed with high-level robotic tasks enacted through decomposition. By referring to the below Action Planner query, it may be noted that queries may be made to each of the plurality of robotic knowledge bases to determine whether the query terms are located in the world model or target templates. The absence of the query terms triggers the one or more robotic perceptions. Similarly, by referring to the below Action Planner query, it may be noted that queries for robot and action models may be triggered, which can further trigger one or more runtime exceptions, for example, a lack of robotic capabilities.
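The three Orc combinators above have rough analogues in ordinary concurrency libraries. The following is a hedged asyncio sketch of their behaviour (this is only an illustrative approximation; Orc's semantics are richer, and all names here are hypothetical):

```python
import asyncio

async def site(value, delay=0.0):
    """A stand-in for an Orc site: publish `value` after `delay` seconds."""
    await asyncio.sleep(delay)
    return value

async def parallel(s1, s2):
    """s_1 | s_2 : run both sites concurrently and collect both publications."""
    return await asyncio.gather(s1, s2)

async def sequential(s1, make_s2):
    """s_1 >x> s_2 : feed the value published by s_1 into a fresh run of s_2."""
    x = await s1
    return await make_s2(x)

async def pruning(s1, s2):
    """s_1 <x< s_2 (simplified): take the first publication, cancel the rest."""
    tasks = {asyncio.ensure_future(s1), asyncio.ensure_future(s2)}
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()
    return done.pop().result()

async def main():
    both = await parallel(site("a"), site("b"))                       # ['a', 'b']
    chained = await sequential(site(2), lambda x: site(x * 10))       # 20
    first = await pruning(site("slow", 0.2), site("fast", 0.01))      # 'fast'
    return both, chained, first

print(asyncio.run(main()))
```

Note the simplification: real Orc sequential composition spawns one instance of s_2 per published value, whereas this sketch handles a single publication.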
A function to trigger re-planning, replan_action, that identifies one or more exceptions and adds robotic capabilities, for example, a new robotic model and / or an action template, may also be introduced. Considering an example scenario, the process may be shown via the below set of Orc code:
+++ Robotic Action Planner +++
-- Knowledge Base, Perception and Exception Pointers
include "KB.inc"
val perception = Dictionary()
val exception = Dictionary()
-- Queries for world and object templates, with Perception
def query1(v, db) = Ift(member(v, db)) >> (v, db)
  | Iff(member(v, db)) >> perception.p := v >> perception.p?
def world(w) = query1(w, world_model)
def target(o) = query1(o, object_template)
-- Queries for robot capabilities and actions, with exceptions
def query2(v, db) = Ift(member(v, db)) >> (v, db)
  | Iff(member(v, db)) >> exception.ex := v >> add_capabilities(v, db)
def robot(r) = query2(r, robot_cap)
def action(a) = (query2(a, task_template)
  >> query2(("navigation", "task", "manipulation"), robot_algo))
  | replan_action(a)
-- Replanning procedures for runtime exceptions
def add_capabilities(v, db) = merge(db, [v])
def replan_action(a) = Ift(exception.ex? = null) >> stop
  | Iff(exception.ex? = null) >> exception.ex := null >> action(a)
In an embodiment, an output of a robotic task may now be presented, wherein the robotic task is an input task identical to other robotic tasks that are planned during robotic design time. Upon obtaining the query results from each of the plurality of robotic knowledge bases, a robotic action may be performed, wherein the robotic action comprises navigation, manipulation and task completion, as shown in the below Design Time Simulation query. However, the robotic action in such a case does not trigger the one or more robotic perceptions and / or exceptions.
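The query-with-fallback pattern of query1 and query2 can be rendered in plain Python. This is a hedged sketch of the same control flow, not the Orc planner itself; knowledge-base contents and function names are illustrative:

```python
# Toy knowledge bases (illustrative contents)
world_kb = {"warehouse", "rack"}
robot_kb = {"picker"}
perceptions, exceptions_log = [], []

def query_world(term):
    """query1 pattern: a KB miss on world/object terms triggers a perception."""
    if term not in world_kb:
        perceptions.append(term)   # perception triggered for the unknown term
        world_kb.add(term)         # perceived knowledge is stored in the KB
    return term

def query_robot(term):
    """query2 pattern: a missing capability raises an exception, is added,
    and the query is replanned against the updated KB."""
    if term not in robot_kb:
        exceptions_log.append(term)
        robot_kb.add(term)         # add_capabilities: merge into the KB
        return query_robot(term)   # replan_action: retry with updated KB
    return term

print(query_world("cylinder"))     # miss: perception triggered
print(query_robot("mover"))        # miss: exception raised, capability added
print("perceptions:", perceptions, "| updated robot KB:", sorted(robot_kb))
```

The asymmetry mirrors the disclosure: unknown world/object terms are handled by sensing, while unknown capabilities are handled by extending the knowledge base and replanning.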
Considering an example scenario, the process may be shown via the below set of Orc code:
+++ Design Time Simulation +++
-- Input goals
robot("picker") | action("pick") | object("ball") | world("rack")
-- Output
-------------------------------------------
Target Query Triggered for ball
World Model Query Triggered for rack
Robot Capability Query Triggered for picker
Action Query Triggered for pick
("ball", ["ball", "cube"])
("picker", ["picker"])
("rack", ["warehouse", "rack"])
(("navigation", ["navigation", "manipulation", "task"]), ("task", ["navigation", "manipulation", "task"]), ("manipulation", ["navigation", "manipulation", "task"]))
Action Completed pick
The runtime deployment comprises an important aspect of autonomous robotic deployment. The robotic tasks (to be executed) may be modified with the robot type replaced by a mover, the action by collect, and the target object replaced by a cylinder. As these requirements are not pre-populated into a graph knowledge base, the one or more hardware processors 104 are configured to trigger adaptation and exception handling procedures, as shown in the algorithm supra. By referring to the below Runtime Adaptation Simulation query, it may be noted that a perception is triggered to identify the target cylinder. Further, the one or more hardware processors 104 may also be configured to trigger the exceptions in case of an absence of collect actions and mover robot capabilities, which are further added into one or more of the plurality of robotic knowledge bases (as shown below in the Runtime Adaptation Simulation query). Based upon the runtime simulation / adaptation, the execution of the robotic task may be considered as completed.
Considering an example scenario, the process may be shown via the below set of Orc code:
+++ Runtime Adaptation Simulation +++
-- Input goals
robot("mover") | action("collect") | target("cylinder") | world("rack")
-- Output
-------------------------------------------
Action Query Triggered for collect
Target Query Triggered for cylinder
Robot Capability Query Triggered for mover
World Model Query Triggered for rack
("rack", ["warehouse", "rack"])
Perception Triggered for cylinder
"cylinder"
Action Replan Triggered
Exception Triggered for mover
Adding knowledge of mover
Updated KB ["mover", "picker"]
Action Query Triggered for collect
Exception Triggered for collect
Adding knowledge of collect
Updated KB ["collect", "pick", "drop", "assign"]
(("navigation", ["navigation", "manipulation", "task"]), ("task", ["navigation", "manipulation", "task"]), ("manipulation", ["navigation", "manipulation", "task"]))
Action Completed collect
Performance analysis – In an embodiment, to estimate query and update time in OrientDB graph databases, the method disclosed provides for executing the following test on a Linux™ system with a 4 core i5-6200U CPU @ 2.30GHz and 4GB RAM, which simulates a hyper-connected graph traversal over 50 nodes:
Starting workload GSP (ConcurrencyLevel=4)…
Workload in progress 100% [Shortest paths blocks (block size=50) executed: 50/50]
Total execution time: 2.768 seconds
Executed 50 shortest paths in 2.762 seconds
Path depth: maximum 8, average 5.286, not connected 0
Throughput: 18.103/seconds (Average 55.240ms/op)
Latency Average: 211.996ms/op (50th percentile) – Min: 55.838ms – 99th percentile: 576.653ms – 99.9th percentile 576.653ms – Max 576.653ms – Conflicts: 0
It may be noted that the average graph traversal latency is observed to be around 211 milliseconds, which outperforms traditional perception and object recognition algorithms (that is, 2300 milliseconds referred to in “Hao Zhang et al., “DoraPicker: An autonomous
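The Monte-Carlo study that follows can be sketched in a few lines. This is a hedged illustration only: the means (~211 ms for a KB hit, ~2300 ms for a perception fallback) and the 20,000-run count come from the text, but the exact simulation used in the disclosure is not reproduced here, and the seed and helper names are assumptions:

```python
import random

def percentile(samples, p):
    """Empirical p-th percentile of a list of samples."""
    ordered = sorted(samples)
    return ordered[int(p / 100 * (len(ordered) - 1))]

def simulate(kb_hit_rate, runs=20000, kb_mean=211.0, perception_mean=2300.0):
    """95th-percentile latency when a fraction of queries hit the knowledge
    base (mean kb_mean ms) and the rest fall back to perception."""
    random.seed(0)  # fixed seed purely for a reproducible illustration
    samples = []
    for _ in range(runs):
        mean = kb_mean if random.random() < kb_hit_rate else perception_mean
        samples.append(random.expovariate(1.0 / mean))  # exponential latency
    return percentile(samples, 95)

for rate in (1.0, 0.9, 0.7):
    print(f"KB hit rate {rate:.0%}: 95th percentile latency {simulate(rate):.0f} ms")
```

As in FIG. 7, the tail latency degrades sharply as the knowledge-base hit rate drops, since misses are answered at the much slower perception mean.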
picking system for general objects”, IEEE Intl. Conf. on Automation Science and Engineering (CASE), pp. 721–726, 2016”). Using said mean values for exponentially distributed latency outputs, 20,000 Monte-Carlo runs may be performed. By referring to FIG. 7, outputs for various cases with a knowledge base comprising 100%, 90% and 70% of the output from robotic action plans generated (triggering perception and exception handling in case of missing knowledge) may be referred. Considering an example scenario, by referring to FIG. 7 again, it may be noted that over a base case of 70% robotic plan information in a knowledge base, the 95th percentile latency improves by 56.5% (with 90% of queries answered by the knowledge base) and by 73.9% (with 100% of queries answered by the knowledge base). By referring to FIG. 7 yet again, it may be noted that continuous learning and runtime updates have a significant impact on autonomous robotics performance, thereby signifying the importance of updating each of the plurality of robotic knowledge bases. According to an embodiment of the present disclosure, at step 404, the one or more hardware processors 104 are configured to implement, via the adaptation monitoring module 204, one or more integrity constraints on the runtime simulations for generating the autonomous robotic action framework (the autonomous robotic action framework has already been discussed in the preceding paragraphs). The one or more integrity constraints comprise a set of rules which define a set of consistent database states or changes of state. In an embodiment, three types of integrity checks that may be implemented comprise:
Schema instance – The schema instance check provides for entity types and type checking integrity;
Referential integrity – The referential integrity check verifies that the nodes and edges are uniquely named and that the edges are provided with labels and start/end vertices; and
Functional dependencies – The functional dependency check provides for value restrictions on particular attributes.
For example, the functional dependencies may define minimum and maximum property values. In an embodiment, the one or more integrity constraints are implemented by a knowledge graph update integrity verification technique. The knowledge graph update integrity verification technique comprises using the set of below Orc code for updating the plurality of knowledge bases. By referring to the below Orc code, it may be noted that the below Orc code comprises type checking, redundancy of input data and valid range of properties. When a robot produces a runtime update, the site update_node(key, value) checks for integrity before pushing it to a relevant knowledge base.
Orc code:
-- Type information
type world_model = {. Name :: String, Colour :: String, Location :: (Number, Number, Number) .}
val new_world_model = Dictionary()
-- Integrity check site
def class integrity() =
  val range = range
  def redundancy_check(key, value) = Ift(key = value) >> false | Iff(key = value) >> true
  def value_check(key, range) = Ift(member(key, range)) >> true | Iff(member(key, range)) >> false
  stop
def class update() =
  def update_node(key, value) = Read(key) >aa>
    (integrity.redundancy_check(key, value), integrity.value_check(key, value))
    >> new_world_model.aa := value
  stop
According to an embodiment of the present disclosure, advantages of the proposed autonomous robotic action framework may now be considered in detail. By implementing the proposed autonomous robotic action framework in industrial deployments, information or data or knowledge about robotic world models and capabilities may be encoded in efficient graph database models that may be efficiently queried to extract information for task completion. Using the concurrent programming language Orc, action plans are generated that can handle robotic runtime exceptions and perception information. The proposed framework facilitates efficient implementation of robotic knowledge bases that may be queried during robotic planning and execution.
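The update integrity pattern above can be sketched in Python (the disclosure expresses it in Orc; the data, range set and function behaviour here are illustrative assumptions): a runtime update is committed only if it passes redundancy, type and value-range checks.

```python
# Hypothetical world-model state and a functional-dependency value range
world_state = {"Colour": "red"}
COLOUR_RANGE = {"red", "green", "blue"}

def update_node(key, value):
    """Verify integrity before pushing a runtime update to the knowledge base.
    Returns True when the update is committed, False when it is rejected."""
    if world_state.get(key) == value:                   # redundancy check
        return False
    if not isinstance(value, str):                      # schema / type check
        return False
    if key == "Colour" and value not in COLOUR_RANGE:   # value-range check
        return False
    world_state[key] = value                            # all checks passed
    return True

print(update_node("Colour", "red"))      # False: redundant update
print(update_node("Colour", "purple"))   # False: outside the valid range
print(update_node("Colour", "blue"))     # True: committed
print(world_state)                       # {'Colour': 'blue'}
```

Gating every write this way is what keeps the shared world model consistent when multiple robots publish observations concurrently.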
Further, the proposed framework constantly updates the robotic knowledge bases via the adaptation monitoring module. At runtime, observations of the robotic world models are used to perform updates, subject to the one or more integrity constraints. The graph database querying technique provides extensibility and reuse of information across multiple autonomous robotic deployments. Finally, as shown and discussed above, the graph databases provide efficient storage structures for graphs, thus reducing computational complexity in operations. The written description describes the subject matter herein to enable any person skilled in the art to make and use the embodiments. The scope of the subject matter embodiments is defined by the claims and may include other modifications that occur to those skilled in the art. Such other modifications are intended to be within the scope of the claims if they have similar elements that do not differ from the literal language of the claims or if they include equivalent elements with insubstantial differences from the literal language of the claims. It is to be understood that the scope of the protection is extended to such a program and in addition to a computer-readable means having a message therein; such computer-readable storage means contain program-code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The hardware device can be any kind of device which can be programmed including e.g. any kind of computer like a server or a personal computer, or the like, or any combination thereof. The device may also include means which could be e.g. hardware means like e.g. an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein.
Thus, the means can include both hardware means and software means. The method embodiments described herein could be implemented in hardware and software. The device may also include software means. Alternatively, the embodiments may be implemented on different hardware devices, e.g. using a plurality of CPUs. The embodiments herein can comprise hardware and software elements. The embodiments that are implemented in software include but are not limited to, firmware, resident software, microcode, etc. The functions performed by various modules described herein may be implemented in other modules or combinations of other modules. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can comprise, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The illustrated steps are set out to explain the exemplary embodiments shown, and it should be anticipated that ongoing technological development will change the manner in which particular functions are performed. These examples are presented herein for purposes of illustration, and not limitation. Further, the boundaries of the functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternative boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed. Alternatives (including equivalents, extensions, variations, deviations, etc., of those described herein) will be apparent to persons skilled in the relevant art(s) based on the teachings contained herein. Such alternatives fall within the scope and spirit of the disclosed embodiments. 
Also, the words “comprising,” “having,” “containing,” and “including,” and other similar forms are intended to be equivalent in meaning and be open ended in that an item or items following any one of these words is not meant to be an exhaustive listing of such item or items, or meant to be limited to only the listed item or items. It must also be noted that as used herein and in the appended claims, the singular forms “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, nonvolatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media. It is intended that the disclosure and examples be considered as exemplary only, with a true scope and spirit of disclosed embodiments being indicated by the following claims.

Documents

Application Documents

# Name Date
1 201821045158-IntimationOfGrant12-09-2023.pdf 2023-09-12
2 201821045158-STATEMENT OF UNDERTAKING (FORM 3) [29-11-2018(online)].pdf 2018-11-29
3 201821045158-PatentCertificate12-09-2023.pdf 2023-09-12
4 201821045158-REQUEST FOR EXAMINATION (FORM-18) [29-11-2018(online)].pdf 2018-11-29
5 201821045158-FORM 18 [29-11-2018(online)].pdf 2018-11-29
6 201821045158-FER.pdf 2021-10-18
7 201821045158-FORM 1 [29-11-2018(online)].pdf 2018-11-29
8 201821045158-COMPLETE SPECIFICATION [27-07-2021(online)].pdf 2021-07-27
9 201821045158-FIGURE OF ABSTRACT [29-11-2018(online)].jpg 2018-11-29
10 201821045158-FER_SER_REPLY [27-07-2021(online)].pdf 2021-07-27
11 201821045158-OTHERS [27-07-2021(online)].pdf 2021-07-27
12 201821045158-DRAWINGS [29-11-2018(online)].pdf 2018-11-29
13 201821045158-ORIGINAL UR 6(1A) FORM 1-180319.pdf 2020-01-13
14 201821045158-COMPLETE SPECIFICATION [29-11-2018(online)].pdf 2018-11-29
15 201821045158-ORIGINAL UR 6(1A) FORM 26-030119.pdf 2019-05-14
16 201821045158-FORM-26 [29-12-2018(online)].pdf 2018-12-29
17 201821045158-Proof of Right (MANDATORY) [12-03-2019(online)].pdf 2019-03-12
18 Abstract1.jpg 2019-01-15

Search Strategy

1 2021-01-1114-31-40E_11-01-2021.pdf

ERegister / Renewals

3rd: 29 Nov 2023

From 29/11/2020 - To 29/11/2021

4th: 29 Nov 2023

From 29/11/2021 - To 29/11/2022

5th: 29 Nov 2023

From 29/11/2022 - To 29/11/2023

6th: 29 Nov 2023

From 29/11/2023 - To 29/11/2024

7th: 29 Nov 2024

From 29/11/2024 - To 29/11/2025