Abstract: The invention relates to a method (700) and a system (100) for automatic identification of a primary manufacturing process (PMP) from a three-dimensional (3D) model (202) of a product. The method (700) includes generating (702) a plurality of images corresponding to a plurality of views of the product based on the 3D model (202) of the product; determining (704) a plurality of confidence score vectors, based on the plurality of images, using a first Artificial Neural Network (ANN) model (206); determining (706) an aggregate confidence score vector, representing a pre-defined PMP category with maximum frequency, based on the plurality of confidence score vectors; extracting (708) a set of manufacturing parameters associated with the product, based on the 3D model (202) of the product; and identifying (710) the PMP based on the aggregate confidence score vector and the set of manufacturing parameters, using a second ANN model (214).
Generally, the invention relates to manufacturing processes. More
specifically, the invention relates to a method and system for automatic
identification of a primary manufacturing process (PMP) from a three-dimensional
(3D) model of a product.
BACKGROUND
[002] Typically, the application of physical and chemical processes to alter
the geometry, properties, and appearance of a raw material in order to make parts or
products may be referred to as a manufacturing process. Manufacturing processes
mainly include subtractive manufacturing, solidification manufacturing, and
deformation manufacturing. Further, a manufacturing process planning activity
may usually be used as a preparatory step to determine a sequence of operations or
processes needed to convert the raw material into a finished product. The
manufacturing process planning activity usually includes a sequence of
manufacturing processes to produce a final product. Moreover, the manufacturing
process planning activity includes a first manufacturing process or a primary
manufacturing process (PMP), and a plurality of subsequent manufacturing
processes or secondary manufacturing processes (SMPs). By way of an example,
there may be a requirement for a cast part to undergo a drilling process followed
by other processes such as deburring, heat treatment, and inspection. In that case,
the casting process may be the PMP and all the other processes may be referred to
as SMPs.
[003] Traditionally, identifying the PMP is heavily dependent on the
knowledge and experience of a manufacturer, and may vary depending on the
availability of machines and tools with the manufacturer and their associations with
suppliers. Thus, the traditional ways of identifying the PMP without considering the
Docket No: IIP-HCL-P0050-IN1
-3-
cost efficiency or design efficiency of the final product may be inefficient. Today,
various systems are available for identifying the PMP based on a 3D model of a
product. These systems use neural network models and consider geometrical features
such as surface curvatures, volume, surface area, visibility, and tool accessibility to
identify the manufacturing processes. However, these systems predict results without
considering critical product and manufacturing information, such as the material of
the product and the production volume. Therefore, the results of these systems may
be inefficient and inaccurate.
[004] Therefore, there is a need to develop a system that may utilize product
and manufacturing information along with extracting visual features of the product,
so as to capture the intuition of a user or a manufacturing expert.
SUMMARY
[005] In one embodiment, a method for automatic identification of a
primary manufacturing process (PMP) from a three-dimensional (3D) model of a
product is disclosed. The method may include generating a plurality of images
corresponding to a plurality of views of the product based on the 3D model of the
product. It should be noted that the 3D model may be rotated at a predefined step
angle along an axis of rotation to obtain the plurality of views of the product. The
method may further include determining a plurality of confidence score vectors,
based on the plurality of images, using a first Artificial Neural Network (ANN)
model. The first ANN model may extract a plurality of visual features of the product
from the plurality of images to capture a complexity of the product. Additionally,
each of the plurality of confidence score vectors may correspond to a plurality of
pre-defined PMP categories. The method may further include determining an
aggregate confidence score vector, representing a pre-defined PMP category with
maximum frequency, based on the plurality of confidence score vectors. The
method may further include extracting a set of manufacturing parameters associated
with the product, based on the 3D model of the product. The set of manufacturing
parameters may include at least one of a first set of parameters with categorical
values and a second set of parameters with numerical values. The method may
further include identifying the PMP based on the aggregate confidence score vector
and the set of manufacturing parameters, using a second ANN model. The second
ANN model may capture non-linear dependencies of identification of the PMP.
[006] In another embodiment, a system for automatic identification of a
PMP from a 3D model of a product is disclosed. The system may include a
processor and a memory communicatively coupled to the processor. The memory
may store processor-executable instructions, which, on execution, may cause the
processor to generate a plurality of images corresponding to a plurality of views of
the product based on the 3D model of the product. It should be noted that the 3D
model may be rotated at a predefined step angle along an axis of rotation to obtain
the plurality of views of the product. The processor-executable instructions, on
execution, may further cause the processor to determine a plurality of confidence
score vectors, based on the plurality of images, using a first Artificial Neural
Network (ANN) model. The first ANN model may extract a plurality of visual
features of the product from the plurality of images to capture a complexity of the
product. Additionally, each of the plurality of confidence score vectors may
correspond to a plurality of pre-defined PMP categories. The processor-executable
instructions, on execution, may further cause the processor to determine an
aggregate confidence score vector, representing a pre-defined PMP category with
maximum frequency, based on the plurality of confidence score vectors. The
processor-executable instructions, on execution, may further cause the processor to
extract a set of manufacturing parameters associated with the product, based on the
3D model of the product. The set of manufacturing parameters may include at least
one of a first set of parameters with categorical values and a second set of
parameters with numerical values. The processor-executable instructions, on
execution, may further cause the processor to identify the PMP based on the
aggregate confidence score vector and the set of manufacturing parameters, using a
second ANN model. The second ANN model may capture non-linear dependencies
of identification of the PMP.
[007] It is to be understood that both the foregoing general description and
the following detailed description are exemplary and explanatory only and are not
restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[008] The present application can be best understood by reference to the
following description taken in conjunction with the accompanying drawing figures,
in which like parts may be referred to by like numerals.
[009] FIG. 1 illustrates a block diagram of an exemplary system in a
network environment for automatic identification of Primary Manufacturing
Process (PMP) from a three-dimensional (3D) model of a product, in accordance
with some embodiments of the present disclosure.
[010] FIG. 2 illustrates a functional block diagram of an exemplary PMP
identification device, in accordance with some embodiments of the present
disclosure.
[011] FIGS. 3A and 3B illustrate an exemplary scenario for generating a
plurality of images using an image generator, in accordance with some
embodiments of the present disclosure.
[012] FIGS. 4A and 4B illustrate exemplary tables representing matrices
for determining an aggregate confidence score vector from a plurality of confidence
score vectors, in accordance with some embodiments of the present disclosure.
[013] FIGS. 5A and 5B illustrate an exemplary system for determining a
plurality of confidence score vectors, in accordance with some embodiments of the
present disclosure.
[014] FIG. 6 illustrates a second ANN model configured to identify the
PMP based on an aggregate confidence score vector and a vector corresponding to
manufacturing parameters, in accordance with some embodiments of the present
disclosure.
[015] FIG. 7 illustrates a flow diagram of an exemplary process for
automatic identification of a primary manufacturing process (PMP) from a three-
dimensional (3D) model of a product, in accordance with some embodiments of the
present disclosure.
DETAILED DESCRIPTION OF THE DRAWINGS
[016] The following description is presented to enable a person of ordinary
skill in the art to make and use the invention and is provided in the context of
particular applications and their requirements. Various modifications to the
embodiments will be readily apparent to those skilled in the art, and the generic
principles defined herein may be applied to other embodiments and applications
without departing from the spirit and scope of the invention. Moreover, in the
following description, numerous details are set forth for the purpose of explanation.
However, one of ordinary skill in the art will realize that the invention might be
practiced without the use of these specific details. In other instances, well-known
structures and devices are shown in block diagram form in order not to obscure the
description of the invention with unnecessary detail. Thus, the present invention is
not intended to be limited to the embodiments shown, but is to be accorded the
widest scope consistent with the principles and features disclosed herein.
[017] While the invention is described in terms of particular examples and
illustrative figures, those of ordinary skill in the art will recognize that the invention
is not limited to the examples or figures described. Those skilled in the art will
recognize that the operations of the various embodiments may be implemented
using hardware, software, firmware, or combinations thereof, as appropriate. For
example, some processes can be carried out using processors or other digital
circuitry under the control of software, firmware, or hard-wired logic. (The term
"logic" herein refers to fixed hardware, programmable logic, and/or an appropriate
combination thereof, as would be recognized by one skilled in the art to carry out
the recited functions.) Software and firmware can be stored on computer-readable
storage media. Some other processes can be implemented using analog circuitry, as
is well known to one of ordinary skill in the art. Additionally, memory or other
storage, as well as communication components, may be employed in embodiments
of the invention.
[018] Referring now to FIG. 1, a block diagram of a system 100 for
automatic identification of a Primary Manufacturing Process (PMP) from a
three-dimensional (3D) model of a product is illustrated, in accordance with some
embodiments of the present disclosure. In an embodiment, a PMP identification
device 102 may consider critical manufacturing parameters such as a material of the
product, a production volume, a tolerance value, and a surface finish, thereby
eliminating the aforementioned problems. The PMP identification device 102 may
use an annotated 3D model of the product, and product and manufacturing
information (PMI) to predict type/category of the PMP associated with the product.
By way of an example, the PMP identification device 102 may predict the
type/category from a plurality of predefined categories (e.g., a casting process, a
moulding process, a turning process, a milling process, a sheet metal process, a
tubing process, and a rolling process). Further, the PMP identification device 102
may employ a plurality of Artificial Neural Network (ANN) models. For example,
in some embodiments, the PMP identification device 102 may include a first ANN
model which may extract visual features and determine a plurality of confidence
score vectors, based on a plurality of images of the product. In some other
embodiments, the PMP identification device 102 may include a second ANN model
along with the first ANN model. The second ANN model may be used to identify
the PMP based on the aggregate confidence score vector and the set of
manufacturing parameters.
[019] Examples of the PMP identification device 102 may include, but are
not limited to, a server, a desktop, a laptop, a notebook, a tablet, a smartphone, a
mobile phone, an application server, or the like. The PMP identification device 102
may include a memory 104, a processor 106, and a display 108. The display 108
may further include a user interface 110. A user, or an administrator may interact
with the PMP identification device 102 and vice versa through the user interface
110. By way of an example, the display 108 may be used to display results of
analysis performed by the PMP identification device 102, to the user. By way of
another example, the user interface 110 may be used by the user to provide inputs
to the PMP identification device 102. Further, for example, in some embodiments,
the PMP identification device 102 may render results to the user/administrator via
the user interface 110.
[020] The memory 104 and the processor 106 of the PMP identification
device 102 may perform various functions including, but not limited to, generating
a plurality of images, determining confidence scores, extracting features from
images, extracting manufacturing information, concatenating vectors, and
identifying the PMP. The memory 104 may store instructions that, when executed
by the processor 106, cause the processor 106 to identify the PMP automatically, in
accordance with some embodiments of the present invention. In accordance with
an embodiment, the memory 104 may also store various data (e.g., 3D model of the
product, image dataset, generated matrices, manufacturing information etc.) that
may be captured, processed, generated, and/or required by the PMP identification
device 102. The memory 104 may be a non-volatile memory (e.g., flash memory,
Read Only Memory (ROM), Programmable ROM (PROM), Erasable PROM
(EPROM), Electrically EPROM (EEPROM) memory, etc.) or a volatile memory
(e.g., Dynamic Random Access Memory (DRAM), Static Random-Access memory
(SRAM), etc.).
[021] In order to identify the PMP, the PMP identification device 102 may
acquire information (e.g., 3D model of the product including an annotated 3D model
and PMI) from a server 112. Further, the server 112 may include a database 114. In
some embodiments, the PMP identification device 102 may interact with the user
or administrator via external devices 116 over a communication network 118. In
such embodiments, the PMP identification device 102 may render the results to the
user/administrator via the user interface 110 over the external devices 116. For
example, the user or administrator may get generated results over the external
devices 116. The one or more external devices 116 may include, but are not limited to,
a desktop, a laptop, a notebook, a netbook, a tablet, a smartphone, a remote server,
a mobile phone, or another computing system/device. The communication network
118 may be any wired or wireless communication network and the examples may
include, but may not be limited to, the Internet, Wireless Local Area Network
(WLAN), Wi-Fi, Long Term Evolution (LTE), Worldwide Interoperability for
Microwave Access (WiMAX), and General Packet Radio Service (GPRS).
[022] Further, the PMP identification device 102 may interact with the
external devices 116 and/or the server 112 for sending/receiving various data, via
the communication network 118. The database 114 may store intermediate results
generated by the PMP identification device 102. In accordance with an
embodiment, the server 112 may be communicatively coupled to the database 114,
via the communication network 118 (not shown in FIG. 1).
[023] Referring now to FIG. 2, a functional block diagram of an exemplary
PMP identification device 200 (similar to the PMP identification device 102) is
illustrated, in accordance with some embodiments of the present disclosure. FIG. 2
is explained in conjunction with FIG. 1. The PMP identification device 200 may be
configured for automatically identifying the PMP based on a 3D model 202 of a
product. In some embodiments, the 3D model 202 of the product may be a 3D
Computer Aided Design (CAD) model of the product. Further, the 3D model 202
may include an annotated 3D model with product and manufacturing information
(PMI).
[024] The PMP identification device 200 may perform various operations
to identify the PMP. Further, to perform various operations, the PMP identification
device 200 may include an image generator 204, a first ANN model 206, a vector
aggregator 208, a PMI extractor 210, a vector generator 212, a concatenating
module 214a, and a second ANN model 214. Additionally, the PMP identification
device 200 may also include a data store (not shown in FIG. 2) to store various data
and intermediate results generated by the modules 204-214.
[025] The image generator 204 may be configured to receive the 3D model
202 of the product. The image generator 204 may generate a plurality of images
based on the 3D model 202 of the product. It should be noted that the plurality of
images may be generated corresponding to a plurality of views of the product. In
some embodiments, the image generator 204 may determine an axis of rotation to
capture multiple perspectives of the 3D model 202. Further, the 3D model 202 may
be rotated at a predefined step angle along the axis of rotation to obtain the plurality
of views of the product. In an embodiment, the axis of rotation may be an axis of
the 3D coordinate system. In some other embodiments, the axis of rotation may be a
component axis determined using principal component analysis (PCA). This may
be further explained in greater detail in conjunction with FIGS. 3A, and 3B. Further,
the image generator 204 may be communicatively coupled to the first ANN model
206 to transmit the plurality of images.
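By way of a non-limiting illustration, the view-generation step described above may be sketched as follows. The sketch assumes a simple point-based stand-in for the 3D model 202 and rotation about the z-axis only; all function and variable names below are illustrative placeholders and do not form part of the disclosed embodiments.

```python
import numpy as np

def rotation_matrix_z(angle_deg):
    """Rotation matrix for a rotation about the z-axis by angle_deg degrees."""
    t = np.radians(angle_deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def generate_views(vertices, step_angle=10):
    """Return one rotated copy of the model's vertices per step angle.

    Each rotated copy stands in for one view of the product; a real
    pipeline would rasterize each pose into a 2D image.
    """
    views = []
    for angle in range(0, 360, step_angle):
        views.append(vertices @ rotation_matrix_z(angle).T)
    return views

# A toy "model": four vertices of a unit tetrahedron.
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
views = generate_views(model, step_angle=10)
print(len(views))  # 36 views for a 10-degree step about one axis
```

Rotating about additional axes (or PCA-derived component axes) would simply repeat the loop with different rotation matrices.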
[026] The first ANN model 206 may receive the plurality of images from
the image generator 204. Further, the first ANN model 206 may be configured to
determine a plurality of confidence score vectors based on the plurality of images.
In some embodiments, the first ANN model 206 may be built and trained using a
Convolutional Neural Network (CNN) model. Alternatively, in some
embodiments, the first ANN model may be implemented using a pre-trained
Convolutional Neural Network (CNN) model and a transfer learning technique. In
such embodiments, the pre-trained CNN model may be utilized to perform a new
task which is similar in nature to its original task. In order to determine the plurality of
confidence score vectors, the first ANN model 206 may extract visual features of
the product from the plurality of images. In other words, the ANN model 206 may
capture complexity of the product by extracting a plurality of visual features of the
product from the plurality of images. It should be noted that each of the plurality of
confidence score vectors may correspond to a plurality of pre-defined PMP
types/categories. The plurality of predefined PMP types/categories may include, but
are not limited to, a casting process, a moulding process, a turning process, a milling
process, a sheet metal process, a tubing process, and a rolling process. The first
ANN model 206 may include convolution layers, pooling layers, and a fully
connected dense layer. The first ANN model 206 may be explained further in
conjunction with FIGS. 5A and 5B. Further, the first ANN model 206 may be
operatively connected to the vector aggregator 208.
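Purely for illustration, the mapping from extracted visual features to a per-view confidence score vector may be sketched as a final dense layer followed by a softmax, so that the scores over the pre-defined PMP categories are non-negative and sum to 1. The feature vector, weights, and category names below are hypothetical placeholders, not the trained parameters of the first ANN model 206.

```python
import numpy as np

# Illustrative pre-defined PMP categories from the disclosure.
PMP_CATEGORIES = ["casting", "moulding", "turning", "milling",
                  "sheet_metal", "tubing", "rolling"]

def softmax(logits):
    """Numerically stable softmax: outputs are in [0, 1] and sum to 1."""
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def confidence_vector(features, weights, bias):
    """Final dense layer: pooled visual features -> per-category confidences."""
    return softmax(features @ weights + bias)

rng = np.random.default_rng(0)
features = rng.normal(size=128)               # pooled visual features of one view
weights = rng.normal(size=(128, len(PMP_CATEGORIES)))
bias = np.zeros(len(PMP_CATEGORIES))
v = confidence_vector(features, weights, bias)
print(v.shape)  # one confidence score per pre-defined PMP category
```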
[027] The vector aggregator 208 may receive the plurality of confidence
score vectors determined by the first ANN model 206. Further, the vector
aggregator 208 may be configured to determine an aggregate confidence score
vector based on the plurality of confidence score vectors. In other words, the vector
aggregator 208 may aggregate the plurality of confidence score vectors to a single
vector of confidence scores. The aggregate confidence score vector may represent
a pre-defined PMP category or a PMP type with maximum frequency. Further, the
vector aggregator 208 may be communicatively connected to the second ANN
model 214 through the concatenating module 214a.
[028] The PMI extractor 210 may extract a set of manufacturing
parameters from the 3D model 202, as the 3D model 202 includes an annotated 3D
model with product and manufacturing information (PMI). The set of
manufacturing parameters may include a first set of parameters with categorical
values and/or a second set of parameters with numerical values. Also, it should be
noted that the set of manufacturing parameters may correspond to a set of
parameters provided in the PMI. The PMI may include, but is not limited to, a material
specification of the product, a production volume, a geometric dimension of the
product, a tolerance value, and a surface finish. Further, the PMI extractor 210 may
be coupled to the vector generator 212.
[029] The vector generator 212 may generate a vector corresponding to the
set of manufacturing parameters. In particular, the first set of parameters with
categorical values may be converted into numerical values. It should be noted that
the vector generator 212 may include an encoder (not shown in FIG. 2) to convert
the categorical values into numerical values. For example, in some embodiments,
the encoder may use a One-Hot encoding technique to convert categorical values into
numerical values. Further, the numerical values may be normalized to a common
scale using a feature scaling algorithm. The vector generator 212 may be
communicatively connected to the second ANN model 214 through the
concatenating module 214a. The concatenating module 214a may be configured to
receive outputs of the vector aggregator 208 and the vector generator 212. Further,
the concatenating module 214a may concatenate the aggregate confidence score
vector and vector corresponding to the set of manufacturing parameters, which may
be further provided as an input to the second ANN model 214.
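The encoding, scaling, and concatenation operations described above may be sketched as follows. The PMI fields, category list, value ranges, and score values shown are hypothetical assumptions for illustration only.

```python
import numpy as np

def one_hot(value, categories):
    """Encode one categorical PMI value (e.g., material) as a one-hot vector."""
    vec = np.zeros(len(categories))
    vec[categories.index(value)] = 1.0
    return vec

def min_max_scale(x, lo, hi):
    """Normalize a numerical PMI value to a common [0, 1] scale."""
    return (x - lo) / (hi - lo)

# Hypothetical material categories and PMI for one product.
MATERIALS = ["steel", "aluminium", "plastic"]
material_vec = one_hot("aluminium", MATERIALS)
volume = min_max_scale(5000, lo=0, hi=100000)     # production volume
tolerance = min_max_scale(0.05, lo=0.0, hi=1.0)   # tolerance value

pmi_vector = np.concatenate([material_vec, [volume, tolerance]])

# Illustrative aggregate confidence score vector over 7 PMP categories.
aggregate_scores = np.array([0.7, 0.1, 0.05, 0.05, 0.05, 0.03, 0.02])

# Concatenated input for the second ANN model.
mlp_input = np.concatenate([aggregate_scores, pmi_vector])
print(mlp_input.shape)  # (12,)
```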
[030] The second ANN model 214 may identify the PMP based on the
aggregate confidence score vector, and the vector corresponding to the set of
manufacturing parameters. The second ANN model 214 may be a Multi-Layer
Perceptron (MLP) classifier. The MLP classifier may include an input layer, a set
of hidden layers, and an output layer. Nodes of the set of hidden layers and the
output layer utilize a non-linear activation function. Therefore, the PMP
identification device 200 may also capture complexity of the product and non-linear
dependencies of identification of the PMP. The MLP classifier may be further
explained in greater detail in conjunction with FIG. 6.
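A minimal sketch of such an MLP forward pass, with ReLU as the non-linear activation in the hidden layers and a softmax output over the pre-defined PMP categories, may read as follows. The layer sizes and random weights are illustrative assumptions, not the trained second ANN model 214.

```python
import numpy as np

def relu(x):
    """Non-linear activation used by the hidden-layer nodes."""
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def mlp_forward(x, layers):
    """Forward pass of an MLP classifier: hidden layers apply ReLU,
    the output layer applies softmax over the PMP categories."""
    for W, b in layers[:-1]:
        x = relu(x @ W + b)
    W, b = layers[-1]
    return softmax(x @ W + b)

rng = np.random.default_rng(1)
sizes = [12, 16, 8, 7]   # input -> two hidden layers -> 7 PMP categories
layers = [(rng.normal(size=(m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

x = rng.normal(size=12)  # concatenated scores + PMI vector
probs = mlp_forward(x, layers)
predicted_pmp = int(np.argmax(probs))
print(probs.shape)  # (7,); the argmax index is the identified PMP category
```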
[031] It should be noted that the PMP identification device 102, 200 may
be implemented in programmable hardware devices such as programmable gate
arrays, programmable array logic, programmable logic devices, or the like.
Alternatively, the PMP identification device 102, 200 may be implemented in
software for execution by various types of processors. An identified engine/module
of executable code may, for instance, include one or more physical or logical blocks
of computer instructions which may, for instance, be organized as a component,
module, procedure, function, or other construct. Nevertheless, the executables of an
identified engine/module need not be physically located together but may include
disparate instructions stored in different locations which, when joined logically
together, comprise the identified engine/module and achieve the stated purpose of
the identified engine/module. Indeed, an engine or a module of executable code
may be a single instruction, or many instructions, and may even be distributed over
several different code segments, among different applications, and across several
memory devices.
[032] As will be appreciated by one skilled in the art, a variety of processes
may be employed for automatic identification of Primary Manufacturing Process
(PMP) from a three-dimensional (3D) model of a product. For example, the
exemplary system 100 and associated PMP identification device 102 may identify
the PMP, by the process discussed herein. In particular, as will be appreciated by
those of ordinary skill in the art, control logic and/or automated routines for
performing the techniques and steps described herein may be implemented by the
system 100 and the associated PMP identification device 102 either by hardware,
software, or combinations of hardware and software. For example, suitable code
may be accessed and executed by the one or more processors on the system 100 to
perform some or all of the techniques described herein. Similarly, application
specific integrated circuits (ASICs) configured to perform some or all the processes
described herein may be included in the one or more processors on the system 100.
[033] Referring now to FIGS. 3A and 3B, an exemplary scenario for
generating a plurality of images 304 using an image generator 300 (similar to the
image generator 204) is illustrated, in accordance with some embodiments of the
present disclosure. The image generator 300 may be responsible for generating the
plurality of images 304 based on a 3D model 302 of a product (same as the 3D
model 202). As explained in FIG. 2, the first ANN model 206 may be employed to
capture visual features based on different views of the product. Therefore, the image
generator 300 may be employed to capture different views of the product and to
generate the plurality of images 304 from the 3D model 302 of the product. In some
embodiments, a training dataset comprising multiple images may be generated and
used to train the first ANN model 206. It should be noted that training may be carried
out with different hyperparameters until a satisfactory validation accuracy is attained.
Once the first ANN model 206 behaves satisfactorily, the first ANN model 206 may
be saved on a disk and used as an inference model. In some other embodiments,
the plurality of images 304 may be provided as an input to the first ANN model 206
while performing a task (i.e., a task related to identification of the PMP). The image
generator 300 may generate fewer images when performing the task as compared
to the number of images generated for training the first ANN model 206.
[034] In some embodiments, an axis of rotation may be determined for
capturing multiple perspectives of the 3D model 302, which may be obtained by
rotating the 3D model 302 about the axis of rotation. Further, in some embodiments,
the axis of rotation may be at least one axis of a standard 3D coordinate system. For
example, the axis of rotation may be at least one of an x-axis, a y-axis, or a
z-axis. Additionally, in some embodiments, another approach of Principal
Component Analysis (PCA) may be used to determine component axes. It should
be noted that the 3D model 302 may be rotated along one or more axes. To get
various images and for better results, the 3D model 302 may be rotated along as
many axes of rotation as possible. The image generator 300 may then capture
multiple views by rotating the 3D model along each rotation axis computed with at
least one of the aforementioned techniques. Also, the 3D model 302 may be rotated
at a predefined step angle along the axis of rotation to obtain the plurality of views
of the product. The number of generated images 304 may be calculated as per
Equation (1), given below:

Number of Images = (360° ÷ Step Angle) × Number of Rotation Axes    … Equation (1)
[035] The image generator 300 may generate a large training dataset in the
order of tens of thousands of images for the pre-trained CNN model (for example,
Inception V2) to be fine-tuned. In one example, consider all three axes (i.e., the
x-axis, y-axis, and z-axis) of the standard coordinate system as axes of rotation and
a step angle of 10 degrees. In that case, the number of images generated by the
image generator 300 may be 108 per model (i.e., (360° ÷ 10°) × 3).
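Equation (1) and the example above may be verified with a short, illustrative computation:

```python
def number_of_images(step_angle_deg, num_rotation_axes):
    """Equation (1): images per model = (360 / step angle) x rotation axes."""
    return int(360 / step_angle_deg) * num_rotation_axes

# Three coordinate axes, 10-degree step angle, as in the example above.
print(number_of_images(10, 3))  # 108 images per model
```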
[036] In some embodiments, a set of images generated by the image
generator 300 may be cropped to get the plurality of images 304. The set of images
may be preprocessed to crop a redundant portion from each of the set of images. For
example, an image 306 of the set of images may be preprocessed. Further, the image
generator 300 may remove the redundant part from the preprocessed image 306 and a
final image 308 of the plurality of images 304 corresponding to the image 306 may be
generated. This may be performed for each of the set of images. After cropping each
of the set of images, the plurality of images 304 may be transmitted to the first ANN
model 206.
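For illustration only, the cropping step may be sketched as trimming each view to the bounding box of its non-background pixels; the background value and the toy image below are assumptions, not the actual preprocessing of the image generator 300.

```python
import numpy as np

def crop_redundant(image, background=0):
    """Crop an image to the bounding box of its non-background pixels,
    removing the redundant border around the product."""
    mask = image != background
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    return image[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]

# Toy grayscale view: the product occupies a small region of the frame.
img = np.zeros((8, 8), dtype=int)
img[2:5, 3:6] = 255
cropped = crop_redundant(img)
print(cropped.shape)  # (3, 3): only the product region remains
```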
[037] Referring now to FIGS. 4A and 4B, exemplary tables 400A, 400B
representing matrices 406 and 408 for determining an aggregate confidence score
vector from a plurality of confidence score vectors are illustrated, in accordance
with some embodiments of the present disclosure. The plurality of confidence score
vectors generated by the first ANN model 206 may be passed to the vector
aggregator 208, as explained in FIG. 2. The vector aggregator 208 may aggregate
its input vectors (i.e., the plurality of confidence score vectors) and generate a single
aggregate confidence score vector as an output. It should be noted that the aggregate
confidence score vector may represent a pre-defined PMP category with maximum
frequency. The vector aggregator 208 may use an aggregator algorithm to determine
the aggregate confidence score vector. A plurality of assumptions may be made for
determining the aggregate confidence score vector based on the aggregator
algorithm. Such assumptions may include, but may not be limited to, the dimension of
the plurality of confidence score vectors being fixed, the sum of confidence scores in
each of the plurality of confidence score vectors being equal to 1, and the value of
each confidence score being between 0 and 1 (i.e., v_ij ∈ [0, 1]).
[038] The tables 400A and 400B include a total of "d" number of columns and
"n" number of rows. Further, the tables 400A and 400B include PMP categories 402
(for example, a first PMP category PMP1, a second PMP category PMP2, a third
PMP category PMP3, and a dth PMP category PMPd), and the plurality of input
confidence score vectors 404 (for example, a first input confidence score vector P1
404a, a second input confidence score vector P2 404b, and an nth input confidence
score vector Pn 404n). Further, the matrix 406 may include various matrix elements
"v_ij". Each element of the matrix 406 represents a confidence score value for the
jth PMP (i.e., PMPj) in an ith input confidence score vector "Pi". Here, the range of
"i" may be 1 to "n" (i.e., i = 1, 2, 3, … n), and for "j" it may be 1 to "d" (i.e., j = 1,
2, 3, … d).
[039] Further, in consideration of the aggregator algorithm, the following mathematical notation may be used:
n = number of input confidence score vectors,
d = dimension of each confidence score vector, which may also be equal to the number of PMPs,
P = an input confidence score vector, and
vij = confidence score value for the jth primary manufacturing process in the ith input confidence score vector.
[040] To determine the aggregate confidence score vector, initially, the
vector aggregator 208 may receive the plurality of input confidence score vectors
including βP1β 404a, βP2β 404b, and βPnβ 404n, where:
P1 = [v11, v12, v13, …, v1d],
P2 = [v21, v22, v23, …, v2d], and
Pn = [vn1, vn2, vn3, …, vnd]
[041] Thereafter, a matrix 406 may be formed, as represented in the table
400A. It should be noted that each row of the matrix 406 represents an input confidence score vector from the plurality of input confidence score vectors. Further, each column represents the confidence scores for a particular PMP. In a next
step, for each row of the matrix 406, a maximum confidence score value may be
determined and a new matrix 408 may be generated from the matrix 406. The matrix
elements of the new matrix 408 may be β0β or β1β. Here, the elements with maximum
confidence score may be marked as β1β and remaining elements may be marked as
β0β, as represented in the table 400B and the matrix 408. Further, a column-wise addition of the matrix elements may be performed, and based on that, a PMP category with the highest value may be considered. The PMP category with the highest value may be denoted by PMPm. Further, in some embodiments, a matrix may be created from the rows in the table 400A for which the corresponding entry of PMPm is equal to 1 in the table 400B. An
average of each column of the created matrix may be calculated and an output vector
representing aggregated confidence score (i.e., the aggregate confidence score
vector) for the PMP with maximum frequency may be determined.
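For illustration only, the aggregation steps described above may be sketched in Python as follows (the function name `aggregate` and the plain-list representation of the vectors are assumptions, not part of the disclosure):

```python
def aggregate(vectors):
    """Aggregate the plurality of confidence score vectors (rows of the
    matrix 406) into a single vector for the PMP with maximum frequency."""
    d = len(vectors[0])
    # Matrix 408: in each row, the maximum score is marked as 1 and the
    # rest as 0 (recorded here as the index of the winning PMP per row).
    winners = [max(range(d), key=lambda j: row[j]) for row in vectors]
    # Column-wise addition of matrix 408: how often each PMP category wins.
    counts = [winners.count(j) for j in range(d)]
    m = max(range(d), key=lambda j: counts[j])  # PMP with the highest value
    # Average the rows of the matrix 406 whose entry for PMPm is 1.
    rows = [r for r, w in zip(vectors, winners) if w == m]
    return [sum(r[j] for r in rows) / len(rows) for j in range(d)]
```

For example, for the input vectors `[0.6, 0.3, 0.1]`, `[0.2, 0.7, 0.1]`, and `[0.8, 0.1, 0.1]`, the first PMP category wins two of the three rows, so the output averages the first and third vectors.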
[042] Referring now to FIG. 5A and 5B, an exemplary system 500 for
determining a plurality of confidence score vectors is illustrated, in accordance with
some embodiments of the present disclosure. The system 500 includes a CNN
model 500a that may correspond to a trained CNN model. It may be appreciated by
those skilled in the art that a 3D CAD model or the 3D model 202 of the product
may include visual features which may be useful to guess about a manufacturing
process associated with the product. This implies that the visual features or visual
clues may include vital information that may be used for decision making of the
PMP. Therefore, in some embodiments, a deep learning model in computer vision
may be used to capture specific intuitions of humans. It should be noted that the
deep learning model may be the CNN model 500a. Further, in some embodiments,
the CNN model 500a may be selected and built from existing ANN architectures
including, but not limited to, an Inception model, a VGG model, a GoogleNet model, and a ResNet model. The CNN model 500a may be pre-trained with a large dataset (e.g., the ImageNet dataset containing 1.2 million images across 1,000 categories).
The CNN model 500a may analyze input images and capture spatial and temporal
dependencies in the input images through an application of relevant filters.
Architecture of the CNN model 500a may include an input layer 502, an output
layer 504, a feature extractor component 506 and a classification component 508,
as illustrated in FIG. 5A.
[043] The feature extractor component 506 may use a combination of
convolution layers (e.g., convolution layers 510a, 510b, and 510n) and pooling
layers (512a, 512b, and 512n). It should be noted that each convolution layer may
be followed by a pooling layer. The feature extractor component 506 may extract
relevant features from images, which may be further passed to the classification
component 508. Further, the classification component 508 may generate output
including confidence score vectors for different target categories (for example, PMP
categories).
[044] In FIG. 5B, the CNN model 500a (i.e., the trained CNN model) may
be used as a fixed feature extractor for a task of interest, and the transfer learning
technique 514 may be utilized to reuse knowledge of one task to perform another
task. Further, using the trained CNN model and the transfer learning technique, a
first ANN model 500b (analogous to the first ANN model 206) may be
implemented. Moreover, upon implementation, layers of the feature extractor component 506 (i.e., the convolution layers and pooling layers) may be finalized (i.e., the parameters of the layers are frozen and saved). Further, the classification component 508, which includes a flatten layer 508a, a dense layer 508b, and a SoftMax layer 508c, may be fine-tuned. In some embodiments, the first ANN model 500b may
correspond to a Visual Feature Extractor (VFE) Net. The VFE-Net may learn visual
features responsible for identifying the PMP. In some embodiments, an Inception
V2 model may be used as the CNN model 500a. The Inception V2 model may be trained on the ImageNet dataset. Further, the implemented first ANN model 500b may
be capable of determining confidence score vectors for different PMP categories
based on input images.
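The transfer-learning arrangement described above (frozen feature extractor, trainable classification head) may be illustrated with a toy numerical sketch. This is not the Inception V2 network itself; the dimensions, random weights, and function names are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen feature extractor: stands in for the finalized convolution and
# pooling layers of component 506, whose parameters are frozen and saved.
W_frozen = rng.standard_normal((64, 16))

def extract_features(image_vec):
    # ReLU features produced by the fixed (non-trainable) extractor.
    return np.maximum(image_vec @ W_frozen, 0.0)

# Trainable classification head (component 508): a dense layer followed by
# SoftMax over d PMP categories; only these parameters would be fine-tuned.
d = 4
W_head = rng.standard_normal((16, d)) * 0.1

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def confidence_score_vector(image_vec):
    return softmax(extract_features(image_vec) @ W_head)
```

During fine-tuning, a gradient step would update only `W_head`, leaving `W_frozen` untouched, which is the essence of using the pre-trained CNN as a fixed feature extractor.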
[045] Referring now to FIG. 6, a second ANN model 600 (similar to the
second ANN model 214) configured to identify the PMP based on an aggregate
confidence score vector 602 and a vector corresponding to manufacturing
parameters 604 is illustrated, in accordance with some embodiments of the present
disclosure. In some embodiments, the second ANN model 600 may be a feed-forward ANN model and may use a supervised learning technique (for example,
backpropagation) for training. Further, the second ANN model 600 may include at
least three layers including an input layer, an output layer, and a hidden layer. In
particular, as illustrated in FIG. 6, the second ANN model 600 may include an input
layer 606, a plurality of hidden layers 608, and an output layer 610.
[046] In some embodiments, the aggregate confidence score vector 602
and a vector corresponding to the set of manufacturing parameters 604 may be
concatenated to form an input for the second ANN model 600. In some
embodiments, the second ANN model 600 may correspond to a Multi-Layer
Perceptron (MLP) that may capture non-linearities based on the aggregate
confidence score vector 602 and the vector corresponding to manufacturing
parameters 604. It should be noted that each node, excluding the nodes of the input layer, may use a non-linear activation function. Further, in some embodiments, the
second ANN model 600 may correspond to a Manufacturing Process Classifier
(MPC) Net.
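For illustration only, a forward pass of such a feed-forward MLP over the concatenated input may be sketched as follows (the function name `identify_pmp`, the layer sizes, and ReLU as the hidden non-linearity are assumptions; the disclosure only requires some non-linear activation outside the input layer):

```python
import numpy as np

def identify_pmp(aggregate_vec, param_vec, layers):
    """Forward pass of a feed-forward MLP: the concatenated input passes
    through hidden layers (ReLU) to a SoftMax output over PMP categories."""
    x = np.concatenate([aggregate_vec, param_vec])  # input layer 606
    for W, b in layers[:-1]:                        # hidden layers 608
        x = np.maximum(W @ x + b, 0.0)              # non-linear activation
    W, b = layers[-1]                               # output layer 610
    z = W @ x + b
    e = np.exp(z - z.max())
    p = e / e.sum()                                 # SoftMax scores
    return int(np.argmax(p)), p                     # identified PMP, scores
```

Here `layers` is a list of `(weight_matrix, bias_vector)` pairs whose first dimension of the final matrix equals the number of PMP categories.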
[047] In some embodiments, the second ANN model 600 may be trained
to a satisfactory validation loss and then may be deployed to perform a task. With
regards to training the second ANN model 600, a number of confidence score
vectors may be generated for a single 3D-CAD model by inferring VFE-Net on
different batches of images. A huge dataset with different confidence score vectors
may be generated depending on a batch size and a number of images generated for
the 3D-CAD model. Each of the confidence score vectors may then be concatenated
with the vector corresponding to the manufacturing parameters, and then the VFE-Net may be trained in a supervised manner to identify the PMP.
[048] Referring now to FIG. 7, an exemplary process for automatic
identification of a primary manufacturing process (PMP) from a three-dimensional
(3D) model of a product is depicted via a flow diagram 700, in accordance with
some embodiments of the present disclosure. Each step of the process may be
performed by a PMP identification device (similar to the PMP identification device
102 and 200). FIG. 7 is explained in conjunction with FIGS. 1-6.
[049] At step 702, a plurality of images corresponding to a plurality of
views of the product may be generated based on the 3D model of the product. The
plurality of images may be generated by an image generator (similar to the image
generator 204 and the image generator 300). Further, the 3D model may include an
annotated 3D model with product and manufacturing information (PMI). It should
be noted that, to obtain the plurality of views of the product, the 3D model may be
rotated at a predefined step angle along an axis of rotation. In some embodiments,
the predefined step angle for rotation may be 10 degrees. Further, the axis of
rotation may be at least one axis of the 3D coordinate system, or a component axis
determined using principal component analysis (PCA).
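The view-generation step above may be sketched as follows (the helper name `view_angles` is illustrative; the disclosure only specifies rotation at a predefined step angle, 10 degrees in some embodiments, about a chosen axis):

```python
def view_angles(step_angle_deg=10.0):
    """Rotation angles (in degrees) about the chosen axis of rotation:
    one view is captured every `step_angle_deg` degrees over a full turn."""
    count = int(round(360.0 / step_angle_deg))
    return [i * step_angle_deg for i in range(count)]
```

With the 10-degree step angle of some embodiments, this yields 36 views per axis of rotation (0, 10, …, 350 degrees).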
[050] At step 704, a plurality of confidence score vectors may be
determined based on the plurality of images. Each of the plurality of confidence score vectors may correspond to a plurality of pre-defined PMP categories. The
plurality of predefined PMP categories may be selected from, but not limited to, a
casting process, a moulding process, a turning process, a milling process, a sheet
metal process, a tubing process, and a rolling process. A first Artificial Neural Network (ANN) model similar to the first ANN model 206 may be used to determine
the plurality of confidence score vectors. In some embodiments, a trained
Convolutional Neural Network (CNN) model may be used to determine a plurality
of confidence score vectors. In some embodiments, a plurality of visual features of
the product may be extracted from the plurality of images in order to capture a
complexity of the product.
[051] At step 706, an aggregate confidence score vector may be
determined. The plurality of confidence score vectors may be considered to
determine the aggregate confidence score vector. The aggregate confidence score
vector may represent a pre-defined PMP category with maximum frequency.
Thereafter, at step 708, a set of manufacturing parameters associated with the
product may be extracted based on the 3D model of the product. The set of
manufacturing parameters may correspond to the PMI. The PMI may include, but is not limited to, a material specification of the product, a production volume, a geometric
dimension of the product, a tolerance value, and a surface finish. Further, the set of
manufacturing parameters may include at least one of a first set of parameters with
categorical values and a second set of parameters with numerical values. In some
embodiments, a vector corresponding to the set of manufacturing parameters may
be generated. Further, to generate the vector corresponding to the set of
manufacturing parameters, the first set of parameters with categorical values may
be converted into numerical values using an encoding algorithm. Also, the numerical values may be normalized to a common scale using a feature scaling algorithm. Additionally, in some other embodiments, the aggregate confidence
score vector and vector corresponding to the set of manufacturing parameters may
be concatenated.
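The encoding, scaling, and concatenation steps above may be sketched as follows. One-hot encoding and min-max scaling are assumed stand-ins here; the disclosure refers only to "an encoding algorithm" and "a feature scaling algorithm" without fixing specific schemes, and all function names are illustrative:

```python
def one_hot(value, categories):
    """Convert a categorical parameter value into numerical values
    (one-hot encoding assumed as the encoding algorithm)."""
    return [1.0 if c == value else 0.0 for c in categories]

def min_max_scale(values, lows, highs):
    """Normalize numerical parameter values to a common [0, 1] scale
    (min-max scaling assumed as the feature scaling algorithm)."""
    return [(v - lo) / (hi - lo) for v, lo, hi in zip(values, lows, highs)]

def manufacturing_parameter_vector(material, materials, numeric, lows, highs):
    # Concatenate encoded categorical values with scaled numerical values.
    return one_hot(material, materials) + min_max_scale(numeric, lows, highs)
```

The resulting vector may then be concatenated with the aggregate confidence score vector to form the input described for the second ANN model.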
[052] At step 710, the PMP may be identified based on the aggregate
confidence score vector and the set of manufacturing parameters. A second ANN model similar to the second ANN model 214 (shown in FIG. 2) may be used to identify
the PMP. The second ANN model may capture non-linear dependencies of
identification of the PMP. In some embodiments of the present disclosure, the second ANN model may be referred to as a Multi-Layer Perceptron (MLP) classifier. The second ANN model or the MLP classifier may include an input layer, a set of hidden layers, and an output layer, and the output layer may utilize a non-linear activation function. This has already been explained in conjunction with FIG. 6.
[053] Thus, the present disclosure may overcome drawbacks of traditional
systems discussed before. The disclosed method and system in the present
disclosure may extract all the relevant information from a 3D model which includes
an annotated 3D model with product and manufacturing information. Moreover,
the disclosure captures complexity of the product and manufacturing parameters
associated with the product, and based on that captures non-linearity between these
two factors for decision making. Therefore, the disclosed system and method may
be highly efficient in identifying the PMP. The disclosure may reduce iterations in the design-to-manufacturing cycle, thereby helping to reduce manufacturing cost. Further, the disclosure may enable a CAD designer to get a correct design the first time by adhering to design guidelines defined for the respective manufacturing process. This in turn may improve quality of the design and product. Also, the
disclosure may play a critical role in making intelligent DFM tools, by aiding
conventional tools with the information of manufacturing process. Thus, this may
be used to automatically determine applicable design guidelines against which an
analysis is to be performed. Additionally, the disclosure may also have application
in cost analysis, where cost from various related manufacturing processes
(depending on the confidence scores) may be compared to take an optimum
decision.
[054] It will be appreciated that, for clarity purposes, the above description
has described embodiments of the invention with reference to different functional
units and processors. However, it will be apparent that any suitable distribution of
functionality between different functional units, processors or domains may be used
without detracting from the invention. For example, functionality illustrated to be
performed by separate processors or controllers may be performed by the same
processor or controller. Hence, references to specific functional units are only to be
seen as references to suitable means for providing the described functionality, rather
than indicative of a strict logical or physical structure or organization.
[055] Although the present invention has been described in connection
with some embodiments, it is not intended to be limited to the specific form set
forth herein. Rather, the scope of the present invention is limited only by the claims.
Additionally, although a feature may appear to be described in connection with
particular embodiments, one skilled in the art would recognize that various features
of the described embodiments may be combined in accordance with the invention.
[056] Furthermore, although individually listed, a plurality of means,
elements or process steps may be implemented by, for example, a single unit or
processor. Additionally, although individual features may be included in different
claims, these may possibly be advantageously combined, and the inclusion in
different claims does not imply that a combination of features is not feasible and/or
advantageous. Also, the inclusion of a feature in one category of claims does not
imply a limitation to this category, but rather the feature may be equally applicable
to other claim categories, as appropriate.
CLAIMS
What is claimed is:
1. A method (700) for automatic identification of a primary manufacturing process
(PMP) from a three-dimensional (3D) model (202) of a product, the method (700)
comprising:
generating (702), by a PMP identification device (200), a plurality of
images corresponding to a plurality of views of the product based on the 3D model
(202) of the product, wherein the 3D model (202) is rotated at a predefined step
angle along an axis of rotation to obtain the plurality of views of the product;
determining (704), by the PMP identification device (200), a plurality of
confidence score vectors, based on the plurality of images, using a first Artificial
Neural Network (ANN) model (206), wherein the first ANN model (206) extracts
a plurality of visual features of the product from the plurality of images to capture
a complexity of the product, and wherein each of the plurality of confidence score
vector correspond to a plurality of pre-defined PMP categories;
determining (706), by the PMP identification device (200), an aggregate
confidence score vector, representing a pre-defined PMP category with maximum
frequency, based on the plurality of confidence score vectors;
extracting (708), by the PMP identification device (200), a set of
manufacturing parameters associated with the product, based on the 3D model
(202) of the product, wherein the set of manufacturing parameters comprises at least
one of a first set of parameters with categorical values and a second set of
parameters with numerical values; and
identifying (710), by the PMP identification device (200), the PMP based
on the aggregate confidence score vector and the set of manufacturing parameters,
using a second ANN model (214), wherein the second ANN model (214) captures
non-linear dependencies of identification of the PMP.
2. The method (700) as claimed in claim 1, wherein:
the 3D model (202) comprises an annotated 3D model with product and
manufacturing information (PMI), wherein the PMI comprises at least one of a
material specification of the product, a production volume, a geometric dimension
of the product, a tolerance value, and a surface finish;
the set of manufacturing parameters correspond to a set of parameters
provided in the PMI; and
the plurality of predefined PMP categories comprises at least one of a
casting process, a moulding process, a turning process, a milling process, a sheet
metal process, a tubing process, and a rolling process.
3. The method (700) as claimed in claim 1, wherein the axis of rotation is one of:
at least one axis of the 3D coordinate system, or a component axis determined using
principal component analysis (PCA).
4. The method (700) as claimed in claim 1, wherein identifying (710) the PMP
comprises:
generating a vector corresponding to the set of manufacturing parameters,
wherein generating comprises converting the first set of parameters with categorical
values into numerical values using an encoding algorithm, and wherein the
numerical values are normalized to a common scale using a feature scaling algorithm;
and
concatenating the aggregate confidence score vector and vector
corresponding to the set of manufacturing parameters.
5. The method (700) as claimed in claim 1, wherein:
the first ANN model (206) comprises a set of convolution layers, a set of
corresponding pooling layers, and a fully connected dense layer, wherein each of
the set of convolution layers is followed by each of the set of corresponding pooling layers; and
the second ANN model (214) is a Multi-Layer Perceptron (MLP) classifier
comprising an input layer (606), a set of hidden layers (608), and an output layer
(610), wherein nodes of the set of hidden layers (608) and the output layer (610)
utilize a non-linear activation function.
6. A system (100) for automatic identification of a primary manufacturing
process (PMP) from a three-dimensional (3D) model (202) of a product, the
system (100) comprising:
a processor (106); and
a memory (104) communicatively coupled to the processor (106), wherein
the memory (104) stores processor-executable instructions, which, on execution,
cause the processor (106) to:
generate (702) a plurality of images corresponding to a plurality of
views of the product based on the 3D model (202) of the product, wherein
the 3D model (202) is rotated at a predefined step angle along an axis of
rotation to obtain the plurality of views of the product;
determine (704) a plurality of confidence score vectors, based on
the plurality of images, using a first Artificial Neural Network (ANN)
model (206), wherein the first ANN model (206) extracts a plurality of
visual features of the product from the plurality of images to capture a
complexity of the product, and wherein each of the plurality of confidence
score vector correspond to a plurality of pre-defined PMP categories;
determine (706) an aggregate confidence score vector, representing
a pre-defined PMP category with maximum frequency, based on the
plurality of confidence score vectors;
extract (708) a set of manufacturing parameters associated with the
product, based on the 3D model (202) of the product, wherein the set of
manufacturing parameters comprises at least one of a first set of
parameters with categorical values and a second set of parameters with
numerical values; and
identify (710) the PMP based on the aggregate confidence score
vector and the set of manufacturing parameters, using a second ANN
model (214), wherein the second ANN model (214) captures non-linear
dependencies of identification of the PMP.
7. The system (100) as claimed in claim 6, wherein:
the 3D model (202) comprises an annotated 3D model with product and
manufacturing information (PMI), wherein the PMI comprises at least one of a
material specification of the product, a production volume, a geometric dimension
of the product, a tolerance value and a surface finish;
the set of manufacturing parameters correspond to a set of parameters
provided in the PMI; and
the plurality of predefined PMP categories comprises at least one of a
casting process, a moulding process, a turning process, a milling process, a sheet
metal process, a tubing process, and a rolling process.
8. The system (100) as claimed in claim 6, wherein the axis of rotation is one of:
at least one axis of the 3D coordinate system, or a component axis determined using
principal component analysis (PCA).
9. The system (100) as claimed in claim 6, wherein the processor-executable
instructions, on execution, cause the processor (106) to identify (710) the PMP by:
generating a vector corresponding to the set of manufacturing parameters,
wherein generating comprises converting the first set of parameters with categorical
values into numerical values using an encoding algorithm, and wherein the
numerical values are normalized to a common scale using a feature scaling algorithm;
and
concatenating the aggregate confidence score vector and vector
corresponding to the set of manufacturing parameters.
10. The system (100) as claimed in claim 6, wherein:
the first ANN model (206) comprises a set of convolution layers, a set of
corresponding pooling layers, and a fully connected dense layer, wherein each of
the set of convolution layers is followed by each of the set of corresponding pooling layers; and
the second ANN model (214) is a Multi-Layer Perceptron (MLP) classifier
comprising an input layer (606), a set of hidden layers (608), and an output layer
(610), wherein nodes of the set of hidden layers (608) and the output layer (610)
utilize a non-linear activation function.
| # | Name | Date |
|---|---|---|
| 1 | 202111011100-FORM 3 [09-02-2024(online)].pdf | 2024-02-09 |
| 2 | 202111011100-STATEMENT OF UNDERTAKING (FORM 3) [16-03-2021(online)].pdf | 2021-03-16 |
| 3 | 202111011100-CLAIMS [30-05-2023(online)].pdf | 2023-05-30 |
| 4 | 202111011100-REQUEST FOR EXAMINATION (FORM-18) [16-03-2021(online)].pdf | 2021-03-16 |
| 5 | 202111011100-REQUEST FOR EARLY PUBLICATION(FORM-9) [16-03-2021(online)].pdf | 2021-03-16 |
| 6 | 202111011100-DRAWING [30-05-2023(online)].pdf | 2023-05-30 |
| 7 | 202111011100-PROOF OF RIGHT [16-03-2021(online)].pdf | 2021-03-16 |
| 8 | 202111011100-FER_SER_REPLY [30-05-2023(online)].pdf | 2023-05-30 |
| 9 | 202111011100-POWER OF AUTHORITY [16-03-2021(online)].pdf | 2021-03-16 |
| 10 | 202111011100-OTHERS [30-05-2023(online)].pdf | 2023-05-30 |
| 11 | 202111011100-FORM-9 [16-03-2021(online)].pdf | 2021-03-16 |
| 12 | 202111011100-FER.pdf | 2023-01-20 |
| 13 | 202111011100-FORM 18 [16-03-2021(online)].pdf | 2021-03-16 |
| 14 | 202111011100-COMPLETE SPECIFICATION [16-03-2021(online)].pdf | 2021-03-16 |
| 15 | 202111011100-FORM 1 [16-03-2021(online)].pdf | 2021-03-16 |
| 16 | 202111011100-DECLARATION OF INVENTORSHIP (FORM 5) [16-03-2021(online)].pdf | 2021-03-16 |
| 17 | 202111011100-DRAWINGS [16-03-2021(online)].pdf | 2021-03-16 |
| 18 | 202111011100-FIGURE OF ABSTRACT [16-03-2021(online)].jpg | 2021-03-16 |
| 19 | SearchHistory(10)E_18-01-2023.pdf | |