Abstract: METHOD AND SYSTEM OF PERFORMING A PREDEFINED ROBOTIC OPERATION ON A VEHICLE BODY

A method (500) and system (100) of performing a predefined robotic operation on a vehicle body (126) is provided. A computing device (102) receives one or more distance parameters from one or more laser sensors (202) placed along a length of the vehicle body (126) while the vehicle body (126) is moving on an assembly line (120). A model of the vehicle body (126) is determined based on the one or more distance parameters and a reference table. The reference table includes one or more distance parameters corresponding to each of a plurality of predefined models. The model of the vehicle body (126) is validated based on a manual input received from a user (118). The computing device (102) enables one or more robotic arms (402) to perform the predefined robotic operation on the vehicle body (126) based on the model of the vehicle body (126) and the validation.

[To be published with FIG. 1]
DESCRIPTION
Technical Field
[001] This disclosure relates generally to detection systems in an assembly line, and more particularly to a system and a method of detection of automobile parts in an assembly line.
BACKGROUND
[002] Automobiles of various shapes and sizes may be manufactured using a single assembly line. The manufacturing assembly line may include a robot to perform one or more operations, such as a body painting operation. The arms of the robot (also referred to as robotic arms) may be operated based on inputs received from an operator about a type of vehicle body. The operator may provide such an input based on a visual inspection of the vehicle body, thereby enabling the robot to perform the operation based on the vehicle body design parameters predefined for a particular type of vehicle body.
[003] Since visual inspection of the vehicle body by an operator is an error-prone exercise, there is a likelihood of false input of the type of vehicle body by the operator. Such a false input may lead to operation of robotic arms using incorrect vehicle body design parameters, which in turn may lead to collision of the robot with the vehicle body. The collision may cause damage to the vehicle body and the robot both. Also, in the case of a painting operation, a collision may lead to wastage of paint applied to the vehicle body.
SUMMARY OF THE INVENTION
[004] In an embodiment, a method of performing a predefined robotic operation on a vehicle body is provided. The method may include receiving, by a computing device, one or more distance parameters from one or more laser sensors placed along a length of the vehicle body while the vehicle body is moving on an assembly line. The method may include determining, by the computing device, a model of the vehicle body based on the one or more distance parameters and a reference table. It is to be noted that the reference table may include one or more distance parameters corresponding to each of a plurality of predefined models. The method may further include validating, by the computing device, the model of the vehicle body based on a manual input received from a user. Further, the method may include enabling, by the computing device, one or more robotic arms to perform the predefined operation on the vehicle body based on the model of the vehicle body and the validation.
[005] In another embodiment, a system for performing a predefined robotic operation on a vehicle body is provided. The system may include a processor and a memory in a computing device. The memory may store a plurality of processor-executable instructions, which upon execution by the processor, may cause the processor to receive one or more distance parameters from one or more laser sensors placed along a length of the vehicle body while the vehicle body is moving on an assembly line. The processor may further determine a model of the vehicle body based on the one or more distance parameters and a reference table. It is to be noted that the reference table may include one or more distance parameters corresponding to each of a plurality of predefined models. The processor may validate the model of the vehicle body based on a manual input received from a user. Further, the processor may be configured to enable one or more robotic arms to perform the predefined operation on the vehicle body based on the model of the vehicle body and the validation.
BRIEF DESCRIPTION OF THE DRAWINGS
[006] The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, serve to explain the disclosed principles.
[007] FIG. 1 is a block diagram of a system for performing a predefined robotic operation on a vehicle body, in accordance with an embodiment of the present disclosure.
[008] FIG. 2A and FIG. 2B illustrate exemplary scenarios of determination of distance parameters in an assembly line, in accordance with an embodiment of the present disclosure.
[009] FIG. 3 is a functional block diagram of the computing device, in accordance with an embodiment of the present disclosure.
[010] FIG. 4 is the digital assembly line model depicting a robotic operation, in accordance with an embodiment of the present disclosure.
[011] FIG. 5 is a flowchart of a method of performing a predefined robotic operation on a vehicle body, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION OF THE DRAWINGS
[012] Exemplary embodiments are described with reference to the accompanying drawings. Wherever convenient, the same reference numbers are used throughout the drawings to refer to the same or like parts. While examples and features of disclosed principles are described herein, modifications, adaptations, and other implementations are possible without departing from the scope of the disclosed embodiments. It is intended that the following detailed description be considered as exemplary only, with the true scope being indicated by the following claims. Additional illustrative embodiments are described below.
[013] Presently, assembly lines used in the automobile manufacturing industry may be electrically and manually controlled using mechanical and sensor-based mechanisms. Using the assembly lines, multiple types of operations related to automobile manufacturing may be performed. In some embodiments, these operations may be performed by one or more robots. One such robotic operation may be painting of the vehicle body. For performing this operation, one or more robotic arms may be provided at a painting booth in the assembly line. The movement of the robotic arms may be enabled in accordance with the dimensions of the vehicle body, so that the robotic arms may move close to the vehicle body without contacting it.
[014] The model of a vehicle on which the robotic arms are to perform an operation may be determined by an operator and fed as an input to a control unit. The operator may determine the model based on a visual inspection. In addition, the operator may also select a variant of the particular model, so that operations corresponding to the particular variant, such as painting with a colour specific to that variant, are carried out. In case there is an error in model selection by the operator, the robotic arms may not be configured as per the dimensions of the vehicle body. This may cause the robotic arms to collide with the vehicle body, damaging the vehicle body and/or the robots themselves. Therefore, there is a requirement for accurately determining a model of the vehicle body before the vehicle body is sent to the painting booth of the assembly line, in order to avoid any accidents or collisions of the robotic arms with the vehicle body while performing the one or more robotic operations. Accordingly, the present invention provides a method and system for performing a robotic operation on a vehicle body based on determination and validation of a model of the vehicle body.
[015] Referring now to FIG. 1, a block diagram indicating a network implementation of a system 100 for performing a predefined robotic operation on a vehicle body is illustrated, in accordance with an embodiment of the present disclosure. By way of an example, an automobile manufacturing assembly line 120 may include various machineries and conveyors installed in a workshop of a manufacturing unit or a production unit. In an embodiment, the assembly line 120 may be divided into various sections or booths for performing one or more operations. A vehicle body 126 may be moved from one booth to another on the assembly line after completion of an operation on the vehicle body in the booth. In an embodiment, one such booth may be utilized for performing a painting operation. As shown in FIG. 1, a booth 121 of the assembly line 120 is shown having one or more conveyors 124 for carrying a vehicle body 126. In an embodiment, the booth 121 may be located before a booth (not shown) having one or more robots for performing one or more operations such as a painting operation. Further, the vehicle body 126 may move from the preceding booth 121 to the booth having the one or more robots via the one or more conveyors 124.
[016] The system 100 may include a computing device 102. By way of an example, the computing device 102 may be implemented as any computing device that may be configured with, or operatively connected to, a control unit 114. Further, one or more users 118 may communicate with the system 100 through one or more user input/output devices 106 provided in the computing device 102. The computing device 102 may be communicatively coupled to the control unit 114 through a wireless or wired communication network 112. In an embodiment, the user 118 may be a supervisor or an operator of the booth 121 in the assembly line 120. In an embodiment, the computing device 102 can include a variety of computing systems, including but not limited to, a laptop computer, a desktop computer, a notebook, a workstation, a portable computer, a personal digital assistant, and a handheld or mobile device.
[017] The computing device 102 may include a processor 108 and a memory 110. The memory 110 may store instructions that, when executed by the processor 108, cause the processor 108 to control a predefined robotic operation on the vehicle body 126, as discussed in greater detail below. The memory 110 may be a non-volatile memory or a volatile memory. Examples of non-volatile memory may include, but are not limited to, a flash memory, a Read Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), and an Electrically EPROM (EEPROM) memory. Examples of volatile memory may include, but are not limited to, Dynamic Random Access Memory (DRAM) and Static Random Access Memory (SRAM). The memory 110 may also store various assembly line operational parameters (e.g., section information, section parameters such as time information, emergency parameters, speed parameters, status parameters, etc.) that may be captured, processed, and/or required by the system 100.
[018] In an embodiment, the communication network 112 may be a wired or a wireless network or a combination thereof. The network 112 can be implemented as one of the different types of networks, such as a Common Industrial Protocol (CIP) network, a DeviceNet network, an EtherNet/IP network, an intranet, a local area network (LAN), a wide area network (WAN), the internet, Wi-Fi, an LTE network, a CDMA network, 5G, and the like. Further, the network 112 can either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Common Industrial Protocol (CIP), Open Platform Communications (OPC) protocols, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), and the like, to communicate with one another. Further, the network 112 can include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, and the like.
[019] In an embodiment, the control unit 114 may be connected to various industrial controllers (not shown) which may be deployed throughout the booth 121 of the assembly line 120 to monitor and control respective components of the booth 121 of the assembly line 120. In an embodiment, the industrial controllers may execute one or more control algorithms to facilitate monitoring and control of components such as, but not limited to, machines, workstation machines, and conveyor systems of the assembly line.
[020] According to the current disclosure, one or more sensors 122 may be provided along the sides of the booth 121 of the assembly line 120 such that the sensors 122 may detect one or more distance parameters of the vehicle body 126. The one or more distance parameters of the vehicle body 126 detected by the sensors 122 may be transmitted to the control unit 114. In an embodiment, the control unit 114 may include one or more industrial controllers, which may be software-executable controllers implemented on a hardware platform or a hybrid device that combines controller functionality, such as controlling the movement of the conveyors 124. The control software or algorithms executed by the industrial controllers may include code or algorithms to process input signals read from the industrial devices or components used in the assembly line. The industrial devices may include both input devices, which provide data for controlling the industrial devices forming part of the booth 121 of the assembly line 120, and output devices. Examples of such input devices may include, but are not limited to, sensors 122 such as laser sensors, proximity sensors, telemetry devices (such as temperature sensors, flow meters, level sensors, pressure sensors, light sensors, etc.), manual operator control devices (e.g., push buttons, selector switches, etc.), safety monitoring devices, and other such devices. The industrial devices may also include output devices such as, but not limited to, motor drives, pneumatic actuators, signalling devices, robot control inputs, valves, hydraulic machines, etc.
[021] In an embodiment, the booth 121 of the assembly line 120 may be configured to perform one or more processes that relate to automobile manufacturing, processing, material handling, etc. In an embodiment, a paint application booth (not shown) may be provided after one or more preceding booths, such as the booth 121, for performing a robotic painting operation. In an exemplary embodiment, the industrial controllers may be operated with hardwired inputs and outputs that communicate with the industrial devices forming the components of the assembly line and/or the paint application booth to effect control of devices such as the one or more robotic arms enabled to perform the painting operation on the vehicle body 126. In an embodiment, the computing device 102 and the control unit 114 may be integrated into the robots or the robotic arms in order to perform one or more robotic operations.
[022] The controller I/O can include digital I/O that may be transmitted and received as discrete voltage signals to and from the industrial devices, or analog I/O that transmits and receives analog voltage or current signals to and from the devices. The controller I/O can be received by the control unit 114, which may then process it to convert analog signals to digital or digital signals to analog, using one or more analog-to-digital converters or digital signal processing algorithms, so that the signals can be read into and controlled by the control programs or the components. In an embodiment, the control unit 114 may transmit the signals to the computing device 102, which in turn may save the data in the memory 110. In an embodiment, the distance parameters detected by the sensors 122 corresponding to the vehicle body 126 may be transmitted to the computing device 102 via the control unit 114 for validation.
[023] In an embodiment, the booth 121 of the assembly line 120 may also include one or more laser sensors (not shown) placed along a side wall of the booth 121 of the assembly line 120 or along the length of the conveyor 124 such that they may detect the vehicle body 126 moving on the conveyor 124. In an embodiment, one or more proximity sensors (not shown) may be provided on a roller bed of the conveyor 124 of the booth 121 of the assembly line 120. The proximity sensors may be disposed at predefined reference points on the roller bed of the conveyor 124. The one or more laser sensors may detect the distance parameters of the vehicle body 126 and transmit them to the control unit 114.
[024] The one or more laser sensors may be activated based on detection of the vehicle body 126 by the proximity sensors while the vehicle body 126 is moving on the conveyor 124. In an embodiment, the one or more proximity sensors may be provided on a roller bed of the conveyor 124 at predefined reference points. The control unit 114 may receive the one or more distance parameters from the one or more laser sensors 122. In an embodiment, the one or more distance parameters may be indicative of a width of the vehicle body 126 based on the activation of the proximity sensors at each of one or more predefined reference points along the length of the vehicle body 126.
[025] In an embodiment, the computing device 102 may determine one or more dimensions of the vehicle body 126 based on the one or more distance parameters. In an embodiment, the computing device 102 may determine one or more dimensions of the vehicle body 126 at each of one or more predefined reference points based on the distance parameters determined by the one or more laser sensors. In an embodiment, the laser sensors may be activated based on detection of the vehicle body 126 by the one or more proximity sensors placed at each of the one or more predefined reference points. Accordingly, distance parameters may be determined for the various portions of the vehicle body 126 that are in front of the laser sensor as the vehicle body 126 is detected by the proximity sensors while moving on the conveyor 124.
[026] In an embodiment, the computing device 102 may determine one or more dimensions of the vehicle body 126 such as, but not limited to, a width of the vehicle body 126 at the one or more predefined reference points. Further, the computing device 102 may determine a length of the vehicle body 126 based on detection of the vehicle body 126 by the one or more proximity sensors.
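By way of a non-limiting illustration, the width inference described above could be sketched as follows, assuming a single laser sensor mounted at a known transverse offset from the conveyor centreline and a vehicle body centred on the conveyor; the offset value and function name are illustrative assumptions, not part of the disclosure:

```python
# A minimal sketch, assuming a laser sensor at a fixed transverse offset
# from the conveyor centreline and a symmetric, centred vehicle body.
SENSOR_OFFSET_MM = 1200.0  # assumed sensor-to-centreline distance (invented)

def width_from_laser_range(range_mm: float | None) -> float | None:
    """Infer the body width at the current reference point.

    range_mm is the laser's measured distance to the near side of the
    vehicle body; None models the "no detection / infinite range" output.
    """
    if range_mm is None:  # beam not obstructed by the body
        return None
    half_width = SENSOR_OFFSET_MM - range_mm
    return 2.0 * half_width  # symmetric-body assumption
```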
[027] The computing device 102 may determine a model of the vehicle body 126 on the assembly line based on the one or more distance parameters and a reference table. In an embodiment, the computing device 102 may store in the memory 110 a predefined reference table including one or more distance parameters corresponding to each of a plurality of predefined models. In an embodiment, the reference table may include a list of predefined models or model types and their corresponding one or more distance parameters or one or more dimensions of the vehicle body 126 at each of one or more predefined reference points. The computing device 102 may determine computing-device-inferred model information of the vehicle body 126 based on a lookup, in the reference table, of the one or more distance parameters or the one or more dimensions of the vehicle body 126 determined using the sensors 122 such as, but not limited to, the laser sensor(s) and/or proximity sensor(s).
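A minimal sketch of this reference-table lookup, using only the two exemplary values stated for Table 1 below (Model A: no detection; Model B: 415 mm to 425 mm); the data structure and function names are illustrative assumptions:

```python
# Each model maps to the expected laser output when all proximity
# sensors detect the body: None means "no detection / infinite range",
# a tuple is an inclusive range of laser distances in mm.
REFERENCE_TABLE: dict[str, tuple[float, float] | None] = {
    "Model A": None,            # laser reports no detection
    "Model B": (415.0, 425.0),  # laser range in mm
}

def infer_model(laser_range_mm: float | None) -> str | None:
    """Look up the model whose expected laser output matches the reading."""
    for model, expected in REFERENCE_TABLE.items():
        if expected is None:
            if laser_range_mm is None:
                return model
        elif laser_range_mm is not None:
            lo, hi = expected
            if lo <= laser_range_mm <= hi:
                return model
    return None  # unknown body: leave for manual handling
```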
[028] The computing device 102 may receive an input from the user 118 regarding the model information of the vehicle body 126 on the assembly line 120. In an embodiment, the user 118 may input the model information based on a visual inspection of the vehicle body 126 moving on the conveyor 124 of the assembly line 120.
[029] The validation module 304 of the computing device 102 may validate the model of the vehicle body 126 based on the manual input received from the user 118. The computing device 102 may compare the manual input received from the user 118 with the model information of the vehicle body 126 inferred by the computing device 102. In case the manual input received from the user 118 is determined to be the same as the computing-device-inferred model information of the vehicle body 126, the computing device 102 may enable one or more robotic arms to perform the predefined robotic operation, such as, but not limited to, painting of the vehicle body 126, based on the model of the vehicle body and the validation. Further, in case the manual input received from the user 118 is determined not to be the same as the model information of the vehicle body 126 inferred by the computing device 102, the computing device 102 may sound an alarm or stop the conveyors to prevent the vehicle body 126 from moving to the painting booth of the assembly line 120. Accordingly, based on the comparison, the computing device 102 may ensure that the one or more robotic arms (not shown) are enabled or adjusted based on the model of the vehicle body 126 in order to prevent any collision between the vehicle body 126 and the robotic arms.
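This validation flow could be sketched as below; the control-unit methods (enable_robotic_arms, sound_alarm, stop_conveyor) are hypothetical names standing in for the interface to the control unit 114:

```python
def validate_and_dispatch(inferred_model: str | None,
                          operator_model: str,
                          control_unit) -> bool:
    """Return True and enable the robots only when the operator's manual
    input matches the model inferred from the distance parameters."""
    if inferred_model is not None and operator_model == inferred_model:
        control_unit.enable_robotic_arms(model=inferred_model)
        return True
    # Mismatch: raise an audio-visual alarm and stop the conveyor so the
    # body does not reach the painting booth with wrong arm parameters.
    control_unit.sound_alarm()
    control_unit.stop_conveyor()
    return False
```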
[030] In an embodiment, the computing device 102 may provide a digital simulation of the assembly line 120 based on real-time synchronization of real-time data of the assembly line 120 with data monitored by the control unit 114 and the sensors 122. Accordingly, using the simulation device 104, the user 118 may visualize present states of the sections or booths 121 of the assembly line 120, including the industrial systems or their associated devices, using graphical representations of the processes that display metered or calculated values, employ colour or position animations based on state, render alarm notifications, or employ other such techniques for presenting relevant data to the user 118. Data presented in this manner is read from the industrial controllers by the control unit 114 and presented on one or more display screens of the input/output device 106 according to preferred or selected display formats chosen by the operator (i.e., the user 118).
[031] The control unit 114 and the computing device 102 allow the user 118 to view relevant data values, alarms, and statuses associated with the various machines and devices in a safe and remote manner. This avoids the need for the user 118 to be physically present near the machines in order to view operational and status information of the machines of the assembly line 120.
[032] In an embodiment, the computing device 102 may determine the dimensions of the vehicle body 126 based on the determination of the one or more distance parameters by the sensors 122. The user 118 may be notified through an audio-visual alarm in case the model of the vehicle body 126 input by the user 118 and the model of the vehicle body 126 inferred by the computing device 102 are determined not to be the same.
[033] In an embodiment, the digital simulation of the assembly line may be in the form of two- or three-dimensional views. In an embodiment, the digital simulation may include creation of an augmented reality or virtual reality (AR/VR) presentation based on videos captured by a camera. In an embodiment, the simulation may provide various views of the assembly line 120 in real time, depicting various parameters of the assembly line such as status information of the sections of the assembly line, and speed information of the processing of each section, the workstations, and the conveyors of the assembly line 120.
[034] Referring now to FIG. 2A and FIG. 2B, exemplary scenarios 200A and 200B, respectively, of determination of distance parameters in an assembly line are illustrated, in accordance with an embodiment of the present disclosure. A laser sensor 202 is provided along a side wall of, or transversely to, the booth 121 of the assembly line, or along the length of the conveyor 210, such that it may detect the vehicle body 126 while the vehicle body 126 is moving on the conveyor 210. In an embodiment, the vehicle body 126 may be placed on a skid pallet 206. In an embodiment, one or more proximity sensors 208a, 208b and 208c may be provided on a roller bed 216 of the conveyor 210. As shown in FIG. 2A, the proximity sensors 208a, 208b and 208c are provided at predefined reference points A, B and C. Accordingly, as the vehicle body 126 moves on the conveyor 210, the proximity sensors 208a, 208b and 208c may be activated based on the detection of the vehicle body 126. As the vehicle body 126 moves on the conveyor 210, a range of detection of the laser beam from the laser sensor 202 may be determined at each of the predefined reference points A, B and C.
[035] In the exemplary scenario 200A, the vehicle body 126 is at a position where all three proximity sensors 208a, 208b and 208c sense the presence of the vehicle body 126. Based on the detection by the three sensors, the control unit 114 may activate the laser sensor 202. Since no part of the vehicle body 126 obstructs the laser beam 204, the output of the laser sensor 202 may depict a "no detection range" or an "infinite range". Accordingly, the computing device 102 may determine that the vehicle body 126 is not long enough to reflect the laser beam from the laser sensor 202 when the vehicle body 126 is detected by all three proximity sensors 208a, 208b and 208c. Accordingly, based on the exemplary reference table "Table 1" provided below, the model of the vehicle body 126 may be determined as "Model A".
TABLE 1
Model of vehicle body    Range of detection of laser sensor 202 (all three proximity sensors active)
Model A                  No detection (infinite range)
Model B                  415 mm to 425 mm
[036] FIG. 2B depicts an exemplary scenario 200B in which the range of detection of the laser 204 from the laser sensor 202 may be determined as 'y'. Further, 'y' may be determined to be in a range of 415 mm to 425 mm when the sensor output is "ON" and the vehicle body 126 is detected by all three proximity sensors 208a, 208b and 208c. Accordingly, the laser sensor 202 may be activated based on detection of the vehicle body 126 by all three proximity sensors 208a, 208b and 208c. Table 1 above depicts an exemplary reference table. Based on Table 1, the model of the vehicle body 126 may be determined as "Model B" for the range of detection 'y' of the laser 204, which is in the range of 415 mm to 425 mm. Accordingly, the computing device 102 may distinguish between different models of the vehicle body 126 based on the value of the range of detection of the laser 204 when the vehicle body 126 is detected simultaneously by all the proximity sensors 208a, 208b and 208c provided at the predefined reference points A, B and C, respectively. In an embodiment, the proximity sensors may be placed at predefined reference points in accordance with the design specifications of the various models of the vehicle body 126.
In an embodiment, the proximity sensors 208a, 208b and 208c may be placed at predefined reference points such that they enable differentiation of one model of the vehicle body from another based on the range of detection of the laser 204 when the vehicle body 126 is simultaneously detected by each of the proximity sensors 208a, 208b and 208c. Accordingly, the model of the vehicle body 126 may be determined based on a lookup of the range of detection of the laser 204 in the reference table, as depicted in Table 1.
[037] In another embodiment, two or more models of the vehicle body 126 having the same length may be differentiated based on determination of the range of detection of the laser 204 at one or more predefined reference points. Accordingly, the computing device 102 may determine a model of the vehicle body 126 based on a lookup of the range of detection of the laser 204 in a predefined reference table at each of the one or more predefined reference points. In an embodiment, the reference table may include the range of detection of the laser 204 at the one or more predefined reference points for each of a plurality of models of the vehicle body.
[038] It is to be noted that FIGs. 2A and 2B depict an exemplary scenario in which the laser sensor 202 is activated only when all three proximity sensors 208a, 208b and 208c are activated. However, in other scenarios, the laser sensor 202 may be configured to be activated each time the vehicle body 126 is detected by any one of the proximity sensors 208a, 208b and 208c while moving on the conveyor 124. Such activation of the laser sensor 202 may enable the computing device 102 to determine a range of detection of the vehicle body 126 at each of the predefined reference points A, B and C. Accordingly, the computing device 102 may determine distance parameters for the portion of the vehicle body 126 in front of the laser sensor 202 as the vehicle body 126 is detected by the proximity sensors 208a, 208b and 208c at each of the predefined reference points while moving on the conveyor 124. In this way, distance parameters may be determined for different portions of the vehicle body 126 moving on the conveyor 124, and based on the distance parameters determined by the laser sensor 202 for the different portions, the computing device 102 may determine width information for each of those portions. For example, when the vehicle body 126 reaches reference point A, the proximity sensor 208a may be activated, which in turn may activate the laser sensor 202. The laser beam 204 may then hit the front portion of the vehicle body 126, and the computing device 102 may determine width information of the front region of the vehicle body 126. As the vehicle body 126 moves further, the proximity sensors 208a and 208b at reference points A and B may be activated, which may in turn activate the laser sensor 202. The laser beam 204 at point A may then hit a middle portion of the vehicle body 126, and the width information of the middle portion may be determined by the computing device 102. Similarly, when the vehicle body 126 moves further on the conveyor 210, the third proximity sensor 208c at reference point C may be activated along with the proximity sensors 208a and 208b at reference points A and B. The laser sensor 202 may then be activated to transmit the laser beam 204 at point A, which may hit a rear portion of the vehicle body 126 and provide the width information of the rear portion of the vehicle body 126.
[039] Accordingly, the computing device 102 may determine a unique model of the vehicle body 126 based on the width information determined for the front, middle and rear portions of the vehicle body 126 upon activation of the laser sensor 202 at each of the reference points A, B and C, as illustrated in the sketch below. In an embodiment, a predefined lookup table or reference table may be used by the computing device 102 that may include a list of a plurality of models and their corresponding width information at each of the predefined reference points in order to determine the model of the vehicle body 126.
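A sketch of this per-portion capture and lookup, assuming one recorded reading per proximity-sensor activation; the class, method and table names are illustrative assumptions, not from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class BodyProfile:
    # Maps a reference point ("A", "B", "C") to the width measured while
    # that point was the most recent one detecting the body, so the
    # entries correspond to the front, middle and rear portions in turn.
    widths_mm: dict = field(default_factory=dict)

    def on_proximity_activation(self, point: str, width_mm) -> None:
        """Record the width inferred from the laser reading taken when
        the proximity sensor at this reference point fired."""
        self.widths_mm[point] = width_mm

    def is_complete(self) -> bool:
        return set(self.widths_mm) == {"A", "B", "C"}

def model_from_widths(profile: BodyProfile, width_table: dict):
    """Look up the unique model whose per-point widths match the profile.

    width_table maps model name -> {point: (lo_mm, hi_mm)}; its contents
    would come from the predefined reference table described above."""
    for model, expected in width_table.items():
        if all(
            profile.widths_mm.get(p) is not None
            and lo <= profile.widths_mm[p] <= hi
            for p, (lo, hi) in expected.items()
        ):
            return model
    return None
```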
[040] Referring now to FIG. 3, a functional block diagram 300 of the computing device 102 is illustrated, in accordance with an embodiment of the present disclosure. In some embodiments, the computing device 102 may include a parameter detection module 302, a validation module 304, a customization module 306, and other modules 308.
[041] The parameter detection module 302 may determine one or more dimensions of the vehicle body 126 based on determination of the one or more distance parameters from the sensors 122. The detection of the one or more distance parameters by the sensors 122 is depicted and described in detail with reference to FIG. 2A and FIG. 2B above. Accordingly, based on the output of the laser sensor 202 at the various predefined reference points, the model may be determined based on a lookup from Table 1 provided above. In an embodiment, one or more dimensions of the vehicle body 126 may be determined based on the one or more distance parameters. In an embodiment, the one or more dimensions determined may be indicative of the width of the vehicle body 126. In an embodiment, the length of the vehicle body 126 may be determined based on the detection of the vehicle body 126 by the one or more proximity sensors 208a, 208b, 208c and/or the laser sensor 202. In an embodiment, the length of the vehicle body 126 may be determined based on the speed of the conveyor and the times of detection of the first end and the second end of the vehicle body 126 by the one or more proximity sensors 208a, 208b, 208c and/or the laser sensor 202.
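The length computation mentioned above amounts to length = conveyor speed × time between detection of the two ends. A minimal sketch, assuming a constant conveyor speed and timestamped end detections (for example, at 100 mm/s with the two ends detected 45 s apart, the length would be 4500 mm):

```python
def body_length_mm(conveyor_speed_mm_per_s: float,
                   t_first_end_s: float,
                   t_second_end_s: float) -> float:
    """Length of the body ~= conveyor speed x time the body takes to
    pass a fixed sensor (assumes a constant conveyor speed)."""
    return conveyor_speed_mm_per_s * (t_second_end_s - t_first_end_s)
```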
[042] The validation module 304 may output a validation result if the model determined by the computing device 102, based on the one or more distance parameters determined by the parameter detection module 302, is the same as the model information received as user input. In case the model information received as user input is the same as the model determined by the computing device 102 based on the one or more distance parameters determined by the parameter detection module 302, the validation module 304 may provide an output to the customization module 306, which in turn may customize one or more robotic arms of the succeeding booths of the assembly line 120 in accordance with the model of the vehicle body 126. In case the model information received as user input is not the same as the model determined by the computing device 102, the validation module 304 may generate an error and output an audio and/or visual warning depicting a mismatch. Further, the conveyor may be stopped to prevent the vehicle body from moving to subsequent processing booths of the assembly line 120, thereby preventing any accidents.
[043] The customization module 306 may customize one or more robots in the assembly line 120 based on the determination of the model by the parameter detection module 302 and the validation result from the validation module 304. In an embodiment, one or more robotic arms in the assembly line may be configured to perform a predefined robotic operation, such as, but not limited to, a painting operation. In an embodiment, the painting operation may include application of a primer to the vehicle body 126, application of a base coat, and application of a clear coat. In an embodiment, the application of primer, application of base coat, and application of clear coat may be performed in three separate booths of the assembly line 120. In an embodiment, the booth for application of primer may include, but is not limited to, two or more robotic arms. In an embodiment, the booth for application of base coat may include, but is not limited to, four or more robotic arms. In an embodiment, the booth for application of clear coat may include, but is not limited to, four or more robotic arms. In an embodiment, the robotic arms may be configured to be operated at 70 kV to perform the painting operations. In an embodiment, the robotic arms may be configured to apply a different colour of paint to the vehicle body 126 based on an input received from the user 118 or the computing device 102. Further, the robotic arms may be configured to perform the painting operation based on the model of the vehicle body.
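A hedged sketch of the customization step: per-model arm parameters are looked up and pushed to each arm in the booth. The parameter names, their values, and the arm.configure(...) call are invented for illustration and would in practice come from the predefined vehicle body design parameters:

```python
# Hypothetical per-model arm envelopes; values are invented.
ARM_ENVELOPES = {
    "Model A": {"standoff_mm": 250, "max_reach_mm": 1800},
    "Model B": {"standoff_mm": 250, "max_reach_mm": 2100},
}

def customize_arms(model: str, arms: list) -> None:
    """Push the validated model's envelope to every arm in the booth,
    keeping the arms close to, but clear of, the vehicle body."""
    params = ARM_ENVELOPES[model]
    for arm in arms:
        arm.configure(**params)  # hypothetical robot-controller call
```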
[044] FIG. 4 is the digital assembly line model 400 depicting a robotic operation, in accordance with an embodiment of the present disclosure. In an exemplary embodiment, the digital assembly line model 400 depicts robotic arms 402a and 402b (also collectively referred to as 402) performing one or more robotic operations. In an embodiment, the simulation may depict that the robotic arms 402a and 402b have been enabled to perform the robotic operation by being adjusted to a collision-free position. Further, the simulated digital assembly line model 400 may depict a simulation of the vehicle body 404 based on the distance parameters. Further, the simulated digital assembly line model 400 may depict various parameters associated with the assembly line 120 that are received by the computing device 102 from the control unit 114. In an embodiment, the body type and model information may also be depicted. Further, the model may be depicted as validated or unvalidated based on the validation result received from the validation module 304.
[045] Further, the simulated digital assembly line model 400 may be determined based on real-time data received from the control unit 114 such as, but not limited to, measured or calculated values representing one or more dimensions of the vehicle body 404.
[046] Referring now to FIG. 5, a method 500 of performing a predefined robotic operation on a vehicle body is disclosed, in accordance with an embodiment of the present disclosure. FIG. 5 is explained in conjunction with FIGs. 1, 2A, 2B, 3 and 4. Each step of the method 500 may be executed by one or more of the modules of the system 100 described above.
[047] At step 502, the computing device 102 may receive one or more distance parameters from one or more laser sensors placed along a length of the vehicle body while the vehicle body is moving on an assembly line.
[048] At step 504, the computing device 102 may determine a model of the vehicle body based on the one or more distance parameters and a reference table. In an embodiment, the reference table may include one or more distance parameters corresponding to each of a plurality of predefined models.
[049] At step 506, the computing device 102 may validate the model of the vehicle body based on a manual input received from the user.
[050] At step 508, the computing device 102 may enable one or more robotic arms to perform the predefined robotic operation on the vehicle body based on the model of the vehicle body and the validation.
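Steps 502 to 508 could be tied together as in the sketch below, reusing the illustrative infer_model and validate_and_dispatch helpers from the earlier sketches; sensors and control_unit are hypothetical interfaces standing in for the sensors 122/202 and the control unit 114:

```python
def method_500(sensors, control_unit, operator_model: str) -> None:
    """End-to-end sketch of steps 502-508 under the assumptions above."""
    laser_range = sensors.read_laser_range_mm()  # step 502: receive
    model = infer_model(laser_range)             # step 504: determine
    # Steps 506 and 508: validate against the manual input and, on a
    # match, enable the robotic arms for the predefined operation.
    validate_and_dispatch(model, operator_model, control_unit)
```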
[051] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.
CLAIMS
I/We claim:
1. A method (500) of performing a predefined robotic operation on a vehicle body (126), the method (500) comprising:
receiving (502), by a computing device (102), one or more distance parameters from one or more laser sensors (202) placed along a length of the vehicle body (126) while the vehicle body (126) is moving on an assembly line (120);
determining (504), by the computing device (102), a model of the vehicle body (126) based on the one or more distance parameters and a reference table,
wherein the reference table comprises one or more distance parameters corresponding to each of a plurality of predefined models;
validating (506), by the computing device (102), the model of the vehicle body (126) based on a manual input received from a user (118); and
enabling, by the computing device (102), one or more robotic arms (402) to perform the predefined robotic operation on the vehicle body (126) based on the model of the vehicle body (126) and the validation.
2. The method (500) as claimed in claim 1, comprising:
activating, by the computing device (102), the one or more laser sensors (202) based on detection of the vehicle body (126) by one or more proximity sensors (208a, 208b, 208c) while the vehicle body (126) is moving on the assembly line (120),
wherein the one or more proximity sensors (208a, 208b, 208c) are coupled to a roller bed (216) of the assembly line (120).
3. The method (500) as claimed in claim 1, wherein the one or more distance parameters received from the one or more laser sensors are indicative of a width of the vehicle body (126) at each of one or more predefined reference points along the length of the vehicle body (126).
4. The method (500) as claimed in claim 1, comprising determining, by the computing device (102), one or more dimensions of the vehicle body (126) based on the one or more distance parameters.
5. The method (500) as claimed in claim 1, wherein the predefined robotic operation comprises painting of the vehicle body (126).
6. A system (100) for performing a predefined robotic operation on a vehicle body (126), comprising:
a processor (108); and
a memory (110) communicably coupled to the processor (108), wherein the memory (110) stores processor-executable instructions, which, on execution by the processor (108), cause the processor (108) to:
receive one or more distance parameters from one or more laser sensors (202) placed along a length of the vehicle body (126) while the vehicle body (126) is moving on an assembly line (120);
determine a model of the vehicle body (126) based on the one or more distance parameters and a reference table,
wherein the reference table comprises one or more distance parameters corresponding to a plurality of predefined models;
validate the model of the vehicle body (126) based on a manual input received from a user (118); and
enable one or more robotic arms (402) to perform the predefined robotic operation on the vehicle body (126) based on the model of the vehicle body (126) and the validation.
7. The system (100) as claimed in claim 6, wherein the processor (108) is configured to:
activate the one or more laser sensors (202) based on detection of the vehicle body (126) by one or more proximity sensors (208a, 208b, 208c) while the vehicle body (126) is moving on the assembly line (120),
wherein the one or more proximity sensors (208a, 208b, 208c) are coupled to a roller bed (216) of the assembly line (120).
8. The system (100) as claimed in claim 6, wherein the one or more distance parameters received from the one or more laser sensors (202) are indicative of a width of the vehicle body (126) at each of one or more predefined reference points along the length of the vehicle body (126).
9. The system (100) as claimed in claim 6, wherein the processor (108) is configured to:
determine one or more dimensions of the vehicle body (126) based on the one or more distance parameters.
10. The system (100) as claimed in claim 6, wherein the predefined robotic operation comprises painting of the vehicle body (126).