Abstract: Disclosed are a robotic system (104) and a method (600) having a control unit (202). The control unit (202) is configured to generate a virtual environment having a virtual robot (402) based on a received set of first parameters to emulate a real environment. The control unit (202) is configured to actuate the virtual robot (402) to simulate an operation of a robot (108). At least the virtual robot (402) actuates independently, or the robot (108) and the virtual robot (402) actuate in synchrony based on the simulation. Further, at least the virtual robot (402) actuates independently depending on the set of second parameters, or the robot (108) and the virtual robot (402) actuate in synchrony depending on the set of second parameters. This configuration ensures multi-modal operation of the robotic system (104), thus ensuring ease of operation of the robot (108).
Description:
TECHNICAL FIELD
[0001] The present disclosure relates to a collaborative robot, and more particularly to a system and a method for operating the collaborative robot in a virtual environment and a real environment to perform various operations.
BACKGROUND
[0002] Until recently, many human-scale tasks have relied heavily on human workers, with the benefit that the workers can be reconfigured on the job. Automating such tasks with industrial robots or other assembly-line automation schemes has not been feasible, as these need highly structured workflows and require special safety fences. This, in effect, isolates them from humans and inhibits the safe human-robot interaction needed to flexibly reconfigure tasks. In recent years, with the advancement of technology, it has become feasible to automate human-scale tasks with collaborative robots, also known as cobots, which can be deployed to work safely alongside humans. Cobots promise the flexibility to reconfigure workflows and also offer productivity and precision not possible by humans alone.
[0003] However, collaborative robots have certain limitations. They are typically programmed first on a user equipment, for example a laptop or a computer, and then scheduled to perform the desired task on a workpiece. They are programmed much like industrial robots, which limits their potential. When the operator has to change the operation of the collaborative robots, the operator must first change the program on the user equipment before the collaborative robots can be scheduled to perform the changed operation. This creates hassle for the operator, who has to change the program every time before the collaborative robots can operate. Moreover, when the program provided by the operator on the user equipment does not match the operation to be performed on the workpiece by the collaborative robots, the safety of the operator is compromised and the productivity of the collaborative robots is hampered. Thus, there is a need for an interaction environment that overcomes the above-mentioned drawbacks of robots in general and the existing limitations of collaborative robots.
[0004] Many technological solutions have been developed to address the above-mentioned problems. For instance, one known art discloses a robot graphical programming interactive system based on a webpage and a mobile terminal. However, this configuration also suffers from the limitation that the operator accesses the programming separately on the user equipment and thus is unable to monitor the robot simulation on the user equipment simultaneously with the programming. This also undermines the promised flexibility for the end user to reconfigure workflows. Consequently, there remains the possibility that the program provided by the operator on the user equipment does not match the operation to be performed on the workpiece by the collaborative robots. This raises safety concerns for the operator and hampers the productivity of the collaborative robots.
[0005] Therefore, in view of the above-mentioned problems, it is advantageous to provide a system and a method that can overcome one or more of the above-mentioned problems.
SUMMARY
[0006] This summary is provided to introduce a selection of concepts, in a simplified format, that are further described in the detailed description of the invention. This summary is neither intended to identify key or essential inventive concepts of the invention, nor is it intended for determining the scope of the invention.
[0007] The present disclosure discloses a robotic system. The robotic system includes a control unit. The control unit is configured to receive a set of first parameters corresponding to a real environment. The set of first parameters includes at least one of a physical parameter associated with dimensions of the real environment, a spatial location and orientation of a workspace, and a spatial information of the workspace. The control unit is configured to receive a set of second parameters associated with an operation of a robot. The set of second parameters includes at least one of an end-of-arm tool integrated with the robot, an operation instruction of the robot, a type of the robot, digital and analog signals from external devices, a motion of the robot, and actions of the robot. The control unit is configured to generate a virtual environment having a virtual robot based on the received set of first parameters to emulate the real environment. The control unit is configured to actuate the virtual robot to simulate the operation of the robot. At least the virtual robot actuates independently, or the robot and the virtual robot actuate in synchrony based on the simulation. Further, at least the virtual robot actuates independently depending on the set of second parameters, or the robot and the virtual robot actuate in synchrony depending on the set of second parameters.
[0008] In another embodiment, also disclosed herein is a method to operate the robotic system. The method includes receiving, by a control unit, a set of first parameters from a robot corresponding to a real environment. The set of first parameters includes at least one of a physical parameter associated with dimensions of the real environment, a spatial location and orientation of a workspace, and a spatial information of the workspace. The method includes receiving, by the control unit, a set of second parameters associated with an operation of the robot in the real environment. The set of second parameters includes at least one of an end-of-arm tool integrated with the robot, an operation instruction of the robot, a type of the robot, digital and analog signals from external devices, a motion of the robot, and actions of the robot. The method further includes generating, by the control unit, a virtual environment having a virtual robot based on the received set of first parameters to emulate the real environment. Lastly, the method includes actuating, by the control unit, the virtual robot to simulate the robot. At least the virtual robot actuates independently, or the robot and the virtual robot actuate in synchrony based on the simulation. Further, at least the virtual robot actuates independently depending on the set of second parameters, or the robot and the virtual robot actuate in synchrony depending on the set of second parameters.
[0009] To further clarify the advantages and features of the present invention, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] These and other features, aspects, and advantages of the present invention will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates an environment of a robotic system communicably coupled with a user equipment (UE) and a robot, in accordance with an embodiment of the present disclosure;
Figure 2 illustrates a block diagram of the robotic system, in accordance with an embodiment of the present disclosure;
Figure 3 illustrates a schematic view of the robot, in accordance with an embodiment of the present disclosure;
Figures 4A-4B illustrate a virtual robot on a display unit of the UE, in accordance with an embodiment of the present disclosure;
Figure 5 illustrates multi-modal operations performed by the robotic system, in accordance with an embodiment of the present disclosure;
Figure 6 illustrates a method performed by the robotic system, in accordance with an embodiment of the present disclosure; and
Figure 7 illustrates a use case of the system, in accordance with an embodiment of the present disclosure.
[0011] Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, a plurality of components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
DETAILED DESCRIPTION OF FIGURES
[0012] For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended; such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein, are contemplated as would normally occur to one skilled in the art to which the invention relates. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention belongs. The system and examples provided herein are illustrative only and not intended to be limiting.
[0013] For example, the term “some” as used herein may be understood as “none” or “one” or “more than one” or “all.” Therefore, the terms “none,” “one,” “more than one,” “more than one, but not all” or “all” would fall under the definition of “some.” It should be appreciated by a person skilled in the art that the terminology and structure employed herein are for describing, teaching, and illuminating some embodiments and their specific features and elements and, therefore, should not be construed to limit, restrict, or reduce the spirit and scope of the present disclosure in any way.
[0014] For example, any terms used herein, such as “includes,” “comprises,” “has,” “consists,” and similar grammatical variants do not specify an exact limitation or restriction, and certainly do not exclude the possible addition of a plurality of features or elements, unless otherwise stated. Further, such terms must not be taken to exclude the possible removal of the plurality of the listed features and elements, unless otherwise stated, for example, by using the limiting language including, but not limited to, “must comprise” or “needs to include.”
[0015] Whether or not a certain feature or element was limited to being used only once, it may still be referred to as “plurality of features” or “plurality of elements” or “at least one feature” or “at least one element.” Furthermore, the use of the terms “plurality of” or “at least one” with a feature or element does not preclude there being none of that feature or element, unless otherwise specified by limiting language including, but not limited to, “there needs to be a plurality of…” or “a plurality of elements is required.”
[0016] Unless otherwise defined, all terms and especially any technical and/or scientific terms, used herein may be taken to have the same meaning as commonly understood by a person ordinarily skilled in the art.
[0017] Reference is made herein to some “embodiments.” It should be understood that an embodiment is an example of a possible implementation of any features and/or elements of the present disclosure. Some embodiments have been described for the purpose of explaining a plurality of the potential ways in which the specific features and/or elements of the proposed disclosure fulfil the requirements of uniqueness, utility, and non-obviousness.
[0018] Use of the phrases and/or terms including, but not limited to, “a first embodiment,” “a further embodiment,” “an alternate embodiment,” “one embodiment,” “an embodiment,” “multiple embodiments,” “some embodiments,” “other embodiments,” “further embodiment,” “furthermore embodiment,” “additional embodiment,” or other variants thereof do not necessarily refer to the same embodiments. Unless otherwise specified, a plurality of particular features and/or elements described in connection with a plurality of embodiments may be found in one embodiment, or may be found in more than one embodiment, or may be found in all embodiments, or may be found in no embodiments. Although a plurality of features and/or elements may be described herein in the context of only a single embodiment, or in the context of more than one embodiment, or in the context of all embodiments, the features and/or elements may instead be provided separately or in any appropriate combination or not at all. Conversely, any features and/or elements described in the context of separate embodiments may alternatively be realized as existing together in the context of a single embodiment.
[0019] Any particular and all details set forth herein are used in the context of some embodiments and therefore should not necessarily be taken as limiting factors to the proposed disclosure.
[0020] Embodiments of the present invention will be described below in detail with reference to the accompanying drawings.
[0021] Figure 1 illustrates an environment 100 of a robotic system 104 communicably coupled with a user equipment (UE) 102 and a robot 108, in accordance with an embodiment of the present disclosure. Figure 2 illustrates a block diagram of the robotic system 104, in accordance with an embodiment of the present disclosure. Figure 3 illustrates a schematic view of the robot 108, in accordance with an embodiment of the present disclosure. Figures 4A-4B illustrate a virtual robot 402 on a display unit 106 of the UE 102, in accordance with an embodiment of the present disclosure. Figure 5 illustrates multi-modal operations performed by the robotic system 104, in accordance with an embodiment of the present disclosure.
[0022] In an embodiment, the user equipment (UE) 102 may be a laptop, a desktop, a mobile phone, or any other electronic device, without departing from the scope of the present disclosure. In an embodiment, the user equipment 102 includes the display unit 106, without departing from the scope of the present disclosure. Further, the robotic system 104 may be communicatively coupled with the UE 102 and the robot 108, without departing from the scope of the present disclosure. In another embodiment, the robotic system 104 may be coupled with a plurality of UEs, without departing from the scope of the present disclosure. In an embodiment, the robotic system 104 as disclosed facilitates multi-modal operation, thus ensuring ease of operation of the robot 108 in the real environment, without departing from the scope of the present disclosure.
[0023] Further, the robotic system 104 may be configured to actuate the virtual robot 402 independently or along with the robot 108, without departing from the scope of the present disclosure.
[0024] In an embodiment, the robotic system 104 may include, but is not limited to, a control unit 202, among other components, which are explained in detail in subsequent paragraphs.
[0025] The control unit 202 may be communicatively coupled to the UE 102 and the robot 108 simultaneously, without departing from the scope of the present disclosure. In an embodiment, the control unit 202 includes a processor/controller 204, a memory 206, and module(s) 208. The memory 206, in one example, may store the instructions to carry out the operations of the modules 208. The modules 208 and the memory 206 may be coupled to the processor 204.
[0026] The processor 204 can be a single processing unit or several units, all of which could include multiple computing units. The processor 204 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, state machines, logic circuitries, and/or any devices that manipulate signals based on operational instructions. Among other capabilities, the processor 204 is configured to fetch and execute computer-readable instructions and data stored in the memory 206. The processor 204 may include one or a plurality of processors. The one or plurality of processors may be a general-purpose processor, such as a central processing unit (CPU), an application processor (AP), or the like, a graphics-only processing unit such as a graphics processing unit (GPU) or a visual processing unit (VPU), and/or an AI-dedicated processor such as a neural processing unit (NPU). The one or plurality of processors control the processing of the input data in accordance with a predefined operating rule or artificial intelligence (AI) model stored in the non-volatile memory and the volatile memory. The predefined operating rule or AI model is provided through training or learning.
[0027] The memory 206 may include any non-transitory computer-readable medium known in the art including, for example, volatile memory, such as static random-access memory (SRAM) and dynamic random-access memory (DRAM), and/or non-volatile memory, such as read-only memory (ROM), erasable programmable ROM, flash memories, hard disks, optical disks, and magnetic tapes.
[0028] The modules 208, amongst other things, include routines, programs, objects, components, data structures, etc., which perform particular tasks or implement particular data types. The modules 208 may also be implemented as signal processor(s), state machine(s), logic circuitries, and/or any other device or component that manipulates signals based on operational instructions.
[0029] Further, the modules 208 can be implemented in hardware, as instructions executed by a processing unit, or by a combination thereof. The processing unit can comprise a computer, a processor, such as the processor 204, a state machine, a logic array, or any other suitable device capable of processing instructions. The processing unit can be a general-purpose processor which executes instructions to cause the general-purpose processor to perform the required tasks, or the processing unit can be dedicated to performing the required functions. In another embodiment of the present disclosure, the modules 208 may be machine-readable instructions (software) which, when executed by the processor 204/processing unit, perform any of the described functionalities. Further, the data serves, amongst other things, as a repository for storing data processed, received, and generated by one or more of the modules 208.
[0030] The modules 208 may perform different functionalities which may include, but are not limited to, actuating the virtual robot 402 independently or along with the robot 108. Accordingly, the modules 208 may include a receiving module 210, a generating module 212, and an actuating module 214. Each of the receiving module 210, the generating module 212, and the actuating module 214 may be in communication with each other. Further, each of the receiving module 210, the generating module 212, and the actuating module 214 may be in communication with the processor 204.
[0031] The control unit 202, particularly, the receiving module 210, the generating module 212, and the actuating module 214 may be operated to perform a specific task to actuate the virtual robot 402 independently or along with the robot 108, which is explained in the subsequent paragraphs in conjunction with Figure 1 to Figure 4B.
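For illustration only, the following Python sketch shows one possible way the three modules could divide the work described above. The class and method names (report_environment, report_operation, execute) are assumptions introduced for this sketch and do not appear in the disclosure.

```python
# Illustrative sketch only: module names mirror the disclosure, but the
# data shapes and method signatures are assumptions.

class VirtualRobot:
    """Stand-in for the virtual robot 402."""
    def actuate(self, step):
        print(f"virtual robot simulates: {step}")

class ReceivingModule:
    """Collects the first and second parameter sets from the robot."""
    def receive(self, robot):
        first = robot.report_environment()   # dimensions, workspace pose
        second = robot.report_operation()    # tool, instructions, motions
        return first, second

class GeneratingModule:
    """Builds a virtual environment that emulates the real one."""
    def generate(self, first_params):
        return {"workspace": first_params, "virtual_robot": VirtualRobot()}

class ActuatingModule:
    """Simulates on the virtual robot; optionally drives the real robot."""
    def actuate(self, virtual_env, second_params, real_robot=None):
        for step in second_params.get("instructions", []):
            virtual_env["virtual_robot"].actuate(step)  # always simulate
            if real_robot is not None:                  # synchronous mode
                real_robot.execute(step)
```

In this sketch, simulation always runs on the virtual robot, and the real robot is driven only when a real robot handle is supplied, mirroring the independent and synchronous modes described in the subsequent paragraphs.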
[0032] In an embodiment, the receiving module 210 may be configured to receive a set of first parameters corresponding to a real environment. In an embodiment, the set of first parameters includes at least one of a physical parameter associated with dimensions of the real environment, a spatial location and orientation of a workspace, and a spatial information of the workspace, without departing from the scope of the present disclosure. In an embodiment, the control unit 202 may be coupled to a repository 216 configured to store a plurality of models of the workspace. In an embodiment, the repository 216 stores a plurality of objects, without departing from the scope of the present disclosure. In an embodiment, the repository 216 may be in the control unit 202, without departing from the scope of the present disclosure. In another embodiment, the repository 216 may be a cloud storage, without departing from the scope of the present disclosure.
[0033] Further, in an embodiment, the receiving module 210 may be configured to receive a set of second parameters associated with an operation of the robot 108. The set of second parameters includes at least one of an end-of-arm tool integrated with the robot 108, an operation instruction of the robot 108, a type of the robot 108, digital and analog signals from external devices, a motion of the robot 108, and actions of the robot 108, without departing from the scope of the present disclosure. In an embodiment, the end-of-arm tool may be attached to a distal end of an arm of the robot 108. Further, the end-of-arm tool may be adapted to assist the operation of the robot 108 in the real environment.
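Purely as an illustration of the two parameter sets, the following sketch models them as Python dataclasses; every field name here is an assumption chosen to mirror the prose, not a definition from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FirstParameters:
    """Describes the real environment (field names are assumptions)."""
    environment_dimensions: tuple      # physical dimensions of the environment
    workspace_pose: tuple              # spatial location and orientation
    workspace_spatial_info: dict = field(default_factory=dict)

@dataclass
class SecondParameters:
    """Describes the robot's operation (field names are assumptions)."""
    end_of_arm_tool: Optional[str] = None           # e.g. "gripper"
    operation_instructions: list = field(default_factory=list)
    robot_type: str = "cobot"
    io_signals: dict = field(default_factory=dict)  # digital/analog signals
    motions: list = field(default_factory=list)
    actions: list = field(default_factory=list)
```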
[0034] In an embodiment, the receiving module 210 receives the set of first parameters and the set of second parameters from the robot 108. In an embodiment, the robot 108 may be a real robot, without departing from the scope of the present disclosure. In an embodiment, the real robot may be referred to as the robot 108, without departing from the scope of the present disclosure. The robot 108 may be adapted to generate the set of first parameters and the set of second parameters by one of a static mapping and a dynamic mapping, without departing from the scope of the present disclosure.
[0035] In an embodiment, the static mapping includes defining the spatial location and orientation of the workspace of the real environment by moving the arm of the robot 108, by a user input, over the workspace. The robot 108 determines the spatial location and orientation of the workspace. Further, the determined spatial location and orientation of the workspace may be communicated as one of the set of first parameters and the set of second parameters to the user equipment 102 through the control unit 202 to generate a virtual environment having the virtual robot 402 (as shown in Figures 4A-4B). In one example, a user/operator may move the arm of the robot 108, manually or automatically, over the workspace to determine the spatial location and orientation of the workspace and communicate it as one of the set of first parameters and the set of second parameters to the user equipment 102. In an embodiment, the workspace may be generated in the UE 102 when the UE 102 receives tactile and position information from the robot 108, which is also used to generate the size and shape of objects in the workspace.
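A minimal sketch of this teach-by-guiding idea follows, under the assumption that the operator hand-guides the arm to a few points on the workspace and the robot logs the end-effector position at each point; the function name and pose format are hypothetical.

```python
import statistics

def static_map_workspace(taught_poses):
    """Derive a workspace estimate from hand-guided (x, y, z) poses."""
    xs, ys, zs = zip(*taught_poses)
    centre = (statistics.mean(xs), statistics.mean(ys), statistics.mean(zs))
    extent = (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))
    return {"workspace_centre": centre, "workspace_extent": extent}

# Example: four corners taught by moving the arm over the workspace.
poses = [(0.0, 0.0, 0.1), (0.6, 0.0, 0.1), (0.6, 0.4, 0.1), (0.0, 0.4, 0.1)]
print(static_map_workspace(poses))
```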
[0036] Further, in an embodiment, the dynamic mapping includes defining the spatial location and orientation of the workspace having a machine-readable code. The robot 108 detects the machine-readable code and communicates it as the one of the set of first parameters and the set of second parameters to the user equipment 102 through the control unit 202 to form the virtual environment having the virtual robot 402 (as shown in Figures 4A-4B). In one example, the robot 108 includes a camera to read/detect a machine-readable code of the workspace and accordingly communicates it as the one of the set of first parameters and the set of second parameters to the user equipment 102 through the control unit 202. Further, the user operating the UE 102, after receiving the one of the set of first parameters and the set of second parameters, selects the objects from the repository 216 having stored objects and thus forms the virtual environment.
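The dynamic mapping path might be sketched as follows, assuming the camera's decoder yields a workspace identifier that is resolved against the repository 216; the identifiers and repository contents shown are invented stand-ins, not a real vision pipeline.

```python
# Hypothetical contents of the repository (216); keys stand in for
# identifiers decoded from a machine-readable code (e.g. a QR tag).
WORKSPACE_REPOSITORY = {
    "WS-001": {"model": "assembly_bench", "size_m": (0.6, 0.4)},
    "WS-002": {"model": "packing_table", "size_m": (1.2, 0.8)},
}

def dynamic_map_workspace(decoded_code):
    """Resolve a decoded workspace code to a stored workspace model."""
    model = WORKSPACE_REPOSITORY.get(decoded_code)
    if model is None:
        raise KeyError(f"unknown workspace code: {decoded_code}")
    return model

print(dynamic_map_workspace("WS-001"))
```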
[0037] In an embodiment, the robot 108 communicates with the UE 102 through the control unit 202 by means of a plurality of components. In an embodiment, the robot 108 comprises a plurality of links 302 and a plurality of joints 304. In an embodiment, the plurality of joints 304 may be serially connected with each other through the plurality of links 302. The plurality of joints 304 includes a plurality of sensors, a drive controller, and a plurality of actuators. Further, the plurality of sensors, the drive controller, and the plurality of actuators communicate a value indicative of the set of first parameters and the set of second parameters to the virtual robot 402 on the UE 102 through the control unit 202.
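As an illustration of the kind of value the joints could report, the following sketch bundles assumed per-joint readings (position, velocity, torque) into one message; the field names and units are assumptions.

```python
from dataclasses import dataclass

@dataclass
class JointState:
    joint_id: int
    position_rad: float    # from the joint's position sensor (assumed)
    velocity_rad_s: float  # from the drive controller (assumed)
    torque_nm: float       # from the actuator/torque sensor (assumed)

def collect_joint_states(joints):
    """Bundle all joint readings into one message for the control unit."""
    return [JointState(i, j["pos"], j["vel"], j["tau"])
            for i, j in enumerate(joints)]

# Example: a six-joint arm reporting its state.
raw = [{"pos": 0.1 * i, "vel": 0.0, "tau": 1.5} for i in range(6)]
print(collect_joint_states(raw))
```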
[0038] In an embodiment, after receiving the set of first parameters, the generating module 212 generates the virtual environment having the virtual robot 402, where the virtual environment emulates the real environment. In an embodiment, the virtual robot 402 may be based on predetermined data stored in the repository 216 coupled to the control unit 202. Further, in an embodiment, the virtual robot 402 may be adapted to mimic the operation of the robot 108 in the virtual environment (as shown in Figures 4A-4B).
[0039] In an embodiment, the virtual environment having the virtual robot 402 may be further developed by an input provided through the UE 102. The input may be a set of third parameters including one or more of functional block programming, Cartesian coordinates, a jog interface, and defined instructions for the operation of the robot 108. Further, the display unit 106 of the UE 102 may be configured to display a graphical programming interface. The graphical programming interface provides access to the virtual robot 402 in the virtual environment and the input simultaneously on the display unit 106. This configuration eliminates the requirement of switching between the virtual robot in the virtual environment and the functional block programming, thus providing comfort to the user and eliminating chances of error in the operation.
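One plausible, purely illustrative encoding of the set of third parameters is an ordered list of function blocks that the virtual robot replays, as sketched below; the block names and fields are invented for this example.

```python
# Hypothetical function-block program mixing the input kinds named above.
program = [
    {"block": "move_to", "cartesian": (0.30, 0.10, 0.25)},  # Cartesian target
    {"block": "jog", "axis": "z", "delta_m": -0.05},        # jog interface
    {"block": "instruction", "name": "close_gripper"},      # defined instruction
]

class _StubVirtualRobot:
    def actuate(self, step):
        print("simulating:", step)

def run_on_virtual_robot(program, virtual_robot):
    """Replay the block program on the virtual robot for simulation."""
    for step in program:
        virtual_robot.actuate(step)

run_on_virtual_robot(program, _StubVirtualRobot())
```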
[0040] In an embodiment, after generating the virtual environment having the virtual robot 402, the actuating module 214 actuates the virtual robot 402 to simulate the operation of the robot 108. In an embodiment, at least the virtual robot 402 actuates independently. Further, in an embodiment, the robot 108 and the virtual robot 402 actuate in synchrony based on the simulation. Further, at least the virtual robot 402 actuates independently depending on the set of second parameters, or the robot 108 and the virtual robot 402 actuate in synchrony depending on the set of second parameters.
[0041] In an embodiment, the actuating module 214 may be configured to actuate the robot 108 in the real environment in synchrony with the virtual robot 402 in the virtual environment. In an embodiment, the robot 108 is actuated in the real environment to execute an operation based on the simulation of the operation of the robot 108 in the virtual environment, depending on the set of first parameters and the set of second parameters.
[0042] Further, in an embodiment, the actuating module 214 may be configured to actuate the virtual robot 402 by the input provided in the UE 102. Further, the receiving module 210 of the control unit 202 receives a value associated with the actuation of the virtual robot 402 in the virtual environment based on the set of second parameters and the input. Further, the control unit 202 communicates the value to the robot 108 in the real environment to actuate the robot 108 in the real environment.
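A compact sketch of this relay follows: each value produced by actuating the virtual robot is forwarded to the real robot so that both move together. The classes and the plain function call standing in for the communication link are assumptions.

```python
def actuate_in_synchrony(program, virtual_robot, real_robot):
    for step in program:
        value = virtual_robot.actuate(step)  # simulate first
        real_robot.execute(value)            # then mirror in reality

class _Virtual:
    def actuate(self, step):
        return {"validated_step": step}      # value associated with actuation

class _Real:
    def execute(self, value):
        print("real robot executes:", value)

actuate_in_synchrony([{"move": (0.2, 0.0, 0.3)}], _Virtual(), _Real())
```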
[0043] In another embodiment, the robot 108 may be operated in the virtual environment (as shown in Figure 5). For instance, the virtual robot 402 may mimic the robot 108 and operate in the virtual environment, without departing from the scope of the present disclosure. This configuration ensures ease of programming of the robot 108 and enables testing and simulation of the robot 108. Further, this configuration also assists the operation of the robot 108 in the real environment accurately and efficiently.
[0044] In yet another embodiment, the virtual robot 402 may operate in the real environment (as shown in Figure 5), without departing from the scope of the present disclosure. For instance, the user may configure the virtual robot 402 from the repository 216 as per the requirement, or the virtual robot 402 may mimic the real robot 108. Further, the virtual environment emulates the real environment and the virtual robot 402 operates in the virtual environment. Thus, this configuration ensures accurate working of the robot 108 and may predict and avoid collisions of the robot 108. This configuration provides flexibility to the operator/user to check the compatibility of the robot 108 in the real environment and thus increases the efficiency of the overall operation. Further, this configuration enables advance planning of the operation even when the robot 108 is not available due to reasons such as maintenance.
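As one hedged illustration of how the simulation could predict a collision before the real robot moves, the sketch below checks planned end-effector points against an axis-aligned obstacle box in the virtual environment; the geometry and checking scheme are assumptions, not the disclosed method.

```python
# Flag planned points that fall inside an assumed obstacle box so the
# operation can be replanned before actuating the real robot.
def collides(point, box_min, box_max):
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

planned_path = [(0.1, 0.1, 0.3), (0.3, 0.2, 0.2), (0.5, 0.3, 0.1)]
obstacle = ((0.25, 0.15, 0.0), (0.4, 0.35, 0.25))  # (box_min, box_max)

for point in planned_path:
    if collides(point, *obstacle):
        print("collision predicted at", point, "- replan before actuating")
```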
[0045] Figure 6 illustrates a method 600 performed by the robotic system 104, in accordance with an embodiment of the present disclosure.
[0046] The method 600 can be performed by programmed computing devices, for example, based on instructions retrieved from non-transitory computer readable media. The computer readable media can include machine-executable or computer-executable instructions to perform all or portions of the described method. The computer readable media may be, for example, digital memories, magnetic storage media, such as magnetic disks and magnetic tapes, hard drives, or optically readable data storage media.
[0047] The method 600 includes a series of operations shown at step 602 through step 608 of Figure 6. The method 600 may be performed by the robotic system 104 in conjunction with the control unit 202, the details of which are explained in conjunction with Figures 1 to 5, and the same are not repeated here for the sake of brevity in the present disclosure. The method 600 begins at step 602.
[0048] At step 602, the method 600 includes receiving, by the control unit 202, the set of first parameters from the robot 108 corresponding to the real environment. The set of first parameters includes at least one of the physical parameter associated with dimensions of the real environment, the spatial location and orientation of the workspace, and the spatial information of the workspace.
[0049] At step 604, the method 600 includes receiving, by the control unit 202, the set of second parameters associated with the operation of the robot 108 in the real environment. The set of second parameters includes at least one of the end-of-arm tool integrated with the robot 108, the operation instruction of the robot 108, the type of the robot 108, digital and analog signals from external devices, the motion of the robot 108, and actions of the robot 108.
[0050] At step 606, the method 600 includes generating, by the control unit 202, the virtual environment having the virtual robot 402 based on the received set of first parameters to emulate the real environment.
[0051] At step 608, the method 600 includes actuating, by the control unit 202, the virtual robot 402 to simulate the robot 108. Further, at least the virtual robot 402 actuates independently, or the robot 108 and the virtual robot 402 actuate in synchrony based on the simulation. Furthermore, at least the virtual robot 402 actuates independently depending on the set of second parameters, or the robot 108 and the virtual robot 402 actuate in synchrony depending on the set of second parameters.
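Tying the steps together, a condensed, purely illustrative sketch of method 600 is given below; the stub robot and its method names are stand-ins for the components described in Figures 1 to 5.

```python
class StubRobot:
    """Stand-in for the robot 108 (method names are assumptions)."""
    def report_environment(self):
        return {"dimensions": (2.0, 2.0, 1.5), "workspace_pose": (0, 0, 0)}

    def report_operation(self):
        return {"instructions": ["pick", "place"]}

    def execute(self, step):
        print("real robot:", step)

def method_600(robot, synchronous=True):
    first = robot.report_environment()   # step 602: set of first parameters
    second = robot.report_operation()    # step 604: set of second parameters
    virtual_env = {"emulates": first}    # step 606: generate virtual environment
    for step in second["instructions"]:  # step 608: actuate/simulate
        print("virtual robot:", step)
        if synchronous:                  # synchronous mode mirrors each step
            robot.execute(step)

method_600(StubRobot())
```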
[0052] Figure 7 illustrates a use case of the robotic system 104, in accordance with an embodiment of the present disclosure.
[0053] Referring to Figure 7, in one example, the robot 108, after being actuated by the operations explained with reference to Figures 1 to 6, operates on a mobile station 702 and places materials on the workspace precisely through a tool gripper provided as one of the end-of-arm tools. Further, the user also works simultaneously on the workspace with the robot 108, thus ensuring the safety of the user.
[0054] As would be gathered, the robotic system 104 and the method 600 as disclosed provide a comprehensive approach to generate the virtual environment having the virtual robot 402, where the virtual environment emulates the real environment. Further, the present disclosure discloses the actuation of the virtual robot 402 independently or in synchrony with the robot 108. This configuration provides flexibility to the user to operate the virtual robot 402 on the UE 102 independently, emulating the real environment, and then operate the robot 108 accordingly in the real environment. This ensures that the operation of the robot 108 is checked virtually and errors are debugged, thus ensuring the safety of the user once the robot 108 starts operating accordingly. Further, this configuration also provides flexibility to the user to operate the robot 108 in the virtual environment through the virtual robot 402. This configuration reduces errors in the operation and also increases the productive time of the robot 108. This configuration also reduces the possibility of manufacturing errors by the robot 108, thus being cost-effective. Further, the configuration disclosed in the present disclosure also enhances the safety of the user. This configuration provides the graphical programming interface, which provides access to the virtual robot 402 in the virtual environment and the input simultaneously on the display unit 106. This configuration ensures that the user does not have to switch between the virtual robot in the virtual environment and the functional block programming, thus providing comfort to the user and eliminating the possibility of error in the operation, which increases productivity and ensures the safety of the user.
[0055] While specific language has been used to describe the present disclosure, no limitation arising on account thereof is intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method 600 in order to implement the inventive concept as taught herein. The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment.
Claims:
WE CLAIM:
1. A robotic system (104) comprising:
a control unit (202) configured to:
receive a set of first parameters corresponding to a real environment, wherein the set of first parameters includes at least one of a physical parameter associated with dimensions of the real environment, a spatial location and orientation of a workspace, and a spatial information of the workspace;
receive a set of second parameters associated with an operation of a robot (108), wherein the set of second parameters includes at least one of an end-of-arm tool integrated with the robot (108), an operation instruction of the robot (108), a type of the robot (108), digital and analog signals from external devices, a motion of the robot (108), and actions of the robot (108);
generate a virtual environment having a virtual robot (402) based on the received set of first parameters to emulate the real environment; and
actuate the virtual robot to simulate the operation of the robot,
wherein at least the virtual robot (402) actuates independently, or the robot (108) and the virtual robot (402) actuate in synchrony based on the simulation, and
at least the virtual robot (402) actuates independently, depending on the set of second parameters, or the robot (108) and the virtual robot (402) actuate, in synchrony, depending on the set of second parameters.
2. The robotic system (104) as claimed in claim 1, wherein the control unit (202) is adapted to actuate the robot (108) in the real environment in synchrony with the virtual robot (402) in the virtual environment, wherein the robot (108) is a real robot.
3. The robotic system (104) as claimed in claim 1, wherein the control unit (202) is coupled to a repository (216) configured to store a plurality of models of the workspace.
4. The robotic system (104) as claimed in claim 1, wherein the virtual robot (402) is based on predetermined data stored in a repository (216) coupled to the control unit (202), wherein the virtual robot (402) is adapted to mimic the operation of the robot (108) in the virtual environment.
5. The robotic system (104) as claimed in claim 1, wherein the robot (108) is actuated in the real environment to execute an operation based on the simulation of the operation of the robot (108) in the virtual environment depending on the set of first parameters and the set of second parameters.
6. The robotic system (104) as claimed in claim 1, wherein the end-of-arm tool is attached to a distal end of an arm of the robot (108), and the end-of-arm tool is adapted to assist the operation of the robot (108) in the real environment.
7. The robotic system (104) as claimed in claim 6, wherein the robot (108) is a real robot comprising:
a plurality of links (302); and
a plurality of joints (304) serially connected with each other with the plurality of links (302);
wherein the plurality of joints (304) includes a plurality of sensors, a drive controller, and a plurality of actuators, wherein the plurality of sensors, the drive controller, and the plurality of actuators communicate a value indicative of the set of first parameters and the set of second parameters to the virtual robot (402) through the control unit (202).
8. The robotic system (104) as claimed in claim 7, wherein the real robot (108) is adapted to generate the set of first parameters and the set of second parameters by one of a static mapping and a dynamic mapping, wherein
the static mapping includes defining the spatial location and orientation of the workspace of the real environment by moving an arm of the real robot (108) by a user input over the workspace, wherein the real robot (108) determines the spatial location and orientation of the workspace and communicates as one of the set of first parameters and the set of second parameters to a user equipment (102) through the control unit (202) to generate the virtual environment having the virtual robot (402); and
the dynamic mapping includes defining the spatial location and orientation of the workspace having a machine-readable code, wherein the real robot (108) detects the machine readable code and communicates as the one of the set of first parameters and the set of second parameters to the user equipment (102) through the control unit (202) to form the virtual environment having the virtual robot (402).
9. The robotic system (104) as claimed in claim 1, wherein the virtual environment having the virtual robot (402) is further developed by an input provided through a user equipment (UE) (102), wherein the input is a set of third parameters including one or more of a functional block programming, Cartesian co-ordinates, a jog interface, and defining instructions for the operation of the robot (108).
10. The robotic system (104) as claimed in claim 9, wherein the virtual robot (402) is further actuated by the input provided in the UE (102).
11. The robotic system (104) as claimed in claim 10, wherein the control unit (202) receives a value associated with the actuation of the virtual robot (402) in the virtual environment based on the set of second parameters and the input and is configured to:
communicate the value to the robot (108) in the real environment to actuate the robot (108) in the real environment.
12. The robotic system (104) as claimed in claim 11, wherein the user equipment (102) comprises a display unit (106) configured to display a graphical programming interface, wherein the graphical programming interface provides access to the virtual robot (402) in the virtual environment and the input simultaneously on the display unit (106).
13. A method (600) to operate a robotic system, the method comprising:
receiving (602), by a control unit (202), a set of first parameters from a robot (108) corresponding to a real environment, wherein the set of first parameters includes at least one of a physical parameter associated with dimensions of the real environment, a spatial location and orientation of a workspace, and a spatial information of the workspace;
receiving (604), by the control unit (202), a set of second parameters associated with an operation of the robot (108) in the real environment, wherein the set of second parameters includes at least one of an end-of-arm tool integrated with the robot (108), an operation instruction of the robot (108), a type of the robot (108), digital and analog signals from external devices, a motion of the robot (108), and actions of the robot (108);
generating (606), by the control unit (202), a virtual environment having a virtual robot (402) based on the received set of first parameters to emulate the real environment; and
actuating (608), by the control unit (202), the virtual robot (402) to simulate the robot (108),
wherein at least the virtual robot (402) actuates independently, or the robot (108) and the virtual robot (402) actuate in synchrony based on the simulation, and
at least the virtual robot (402) actuates independently, depending on the set of second parameters, or the robot (108) and the virtual robot (402) actuate in synchrony, depending on the set of second parameters.