
A Controlled Machine Building Block System With Hardware Interconnections And Method Of Training The Same

Abstract: An Internet of Things (IoT) enabled trainable apparatus (100) is disclosed that comprises a central module (300) configured to have a plurality of first type of connectors (202) to connect with a plurality of peripherals. The plurality of peripherals is configured to have a plurality of second type of connectors (202) that allows connectivity with the central module (300), and the plurality of peripherals comprises at least one of an input/output module (200), a matrix module (400), a proxy module (500) and a power module (600). The plurality of peripherals communicates with the central module through one or more interfaces. The IoT enabled trainable apparatus communicates with a learning unit (800) via a communicator (700). The learning unit is a machine learning model trained in real-time with respect to real-world problems; once the IoT enabled trainable apparatus is introduced to the same situation for which the learning unit has been trained, it performs activities to solve the real-world problems by receiving instructions from the learning unit.


Patent Information

Application #
Filing Date
16 January 2020
Publication Number
34/2021
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
pooja@innoveintellects.com
Parent Application

Applicants

Elation Edtech Pvt. Ltd.
F20A - 3rd Floor, Malviya Industrial Area Jaipur Rajasthan India

Inventors

1. Akshat Garg
F20A - 3rd Floor, Malviya Industrial Area Jaipur Rajasthan India 302017

Specification

[0001] The present disclosure relates to the technical field of training
electronic products, and more specifically relates to a trainable apparatus and
method of training the same. This application claims the benefit of priority of
Indian provisional application No. 202011001935, filed 16th January, 2020, the
contents of which are herein incorporated by reference.
BACKGROUND OF INVENTION
[0002] Currently, a wide variety of applications of machine learning and the
Internet of Things (IoT) have been introduced in the market, such as digital
assistants, smart chatbots, smart security systems, video streaming and
recommendations, search predictions, humanoids, driverless cars, etc., and people
are using them in their everyday life. Although people interact with these
technologies, most people do not know how these technologies work or how to
create their own smart machines. All the interactions with these technologies are
limited to passive consumption and data creation. Moreover, people do not know
how machines learn and are afraid of the technology overcoming them in the future.
But as the technology advances, building tools need to be accessible to non-experts
in a cost-effective manner.
[0003] There is thus an opportunity and a need to create a simple building block
system that introduces people to the Internet of Things and enables them to make
their creations intelligent with machine learning in a customized manner.
[0004] In view of the aforementioned problems and objects, the present
invention provides a controlled machine building block system with hardware
interconnections and a method of training the building blocks by applying machine
learning that allows the building blocks to learn and act in a trained manner.
OBJECTIVE OF INVENTION
[0005] The principal objective of the present invention is to provide a
controlled machine building block system with hardware interconnections (i.e., a
trainable apparatus) and a method of training the building blocks by applying
machine learning that allows the building blocks to learn and act in a trained
manner.
[0006] Another objective of the present invention is to provide a
customizable apparatus.
[0007] Another objective of the present invention is to provide an
educational apparatus.
SUMMARY
[0008] Accordingly, disclosed herein are an electronic machine building block
system/educational toy, a software block system and a few of its use cases, which
aim to teach the basics of electronics and IoT, and to make creations smart with
machine learning, without requiring expertise in either. The modular system
consists of preassembled printed circuit boards (PCBs) interconnected with
connectors. Each block performs one or more discrete functions (e.g. Power
provides the required power for the system, the Proximity sensor reads the
proximity of an obstacle in front of it/gives a trigger of an object, etc.) and
communicates with the central block (BRAIN), and the blocks can be connected in
plurality to create a larger system. Some blocks provide sensor inputs; some
blocks provide output in the form of light, rotation, sound, etc. All these blocks
are electrically connected to the central block (BRAIN). The central block is a
communication block communicating with the software terminal. When the software
terminal is connected to the communication block, the system is in an online
state; if it is disconnected, the system is in an offline state.
[0009] In an aspect, an Internet of Things (IoT) enabled trainable apparatus
is disclosed. The IoT enabled trainable apparatus comprises a central module
configured to have a plurality of first type of connectors to connect with a plurality
of peripherals. The plurality of peripherals is configured to have a plurality of
second type of connectors that allows connectivity with the central module. The
plurality of peripherals comprises at least one of an input/output module, a matrix
module, a proxy module and a power module. The plurality of peripherals
communicates with the central module through one or more interfaces. The
plurality of first type of connectors is a female connector and the plurality of second
type of connectors is a male connector. Alternatively, the plurality of first type of
connectors is a male connector and the plurality of second type of connectors is a
female connector.
[0010] The input/output module consists of a plurality of sockets to connect
a plurality of components that represent some behaviour or action. The matrix
module acts as an output device representing output via a plurality of light emitting
diodes arranged in a row and column orientation. The proxy module is configured
to sense objects and/or events in a particular range in its surroundings, and the
power module is configured as a switching module to drive the central module.
[0011] The IoT enabled trainable apparatus is configured to communicate
with a learning unit via a communicator. The learning unit is a machine learning
model that is trained in real-time with respect to real-world problems (or events or
situations) and once the IoT enabled trainable apparatus is introduced to the same
situation for which the learning unit has been trained, the IoT enabled trainable
apparatus performs activities to solve the real-world problems by receiving
instructions from the learning unit.
[0012] The learning unit is trained at a user’s end by providing 10-20
examples to solve the real-world problems. The learning unit is trained using image
data, text data, or other custom data, where the learning unit performs
categorization and labelling based on the training data to solve the real-world
problems.
[0013] In another aspect, an Internet of Things (IoT) enabled trainable
system is disclosed that comprises a trainable apparatus and a learning unit. The
trainable apparatus comprises a central module configured to have a plurality of
first type of connectors to connect with a plurality of peripherals, wherein the
plurality of peripherals is configured to have a plurality of second type of
connectors that allows connectivity with the central module and the plurality of
peripherals comprises at least one of an input/output module, a matrix module, a
proxy module and a power module. The plurality of peripherals communicates with
the central module through one or more interfaces. The learning unit is
communicatively coupled with the trainable apparatus, wherein the learning unit is
a machine learning model that is trained in real-time with respect to real-world
events, and once the trainable apparatus is introduced to the same event for which the
learning unit has been trained, the trainable apparatus performs activities to solve
the real-world event by receiving instructions from the learning unit.
[0014] These and other aspects of the embodiments herein will be better
appreciated and understood when considered in conjunction with the following
description and the accompanying drawings. It should be understood, however, that
the following descriptions, while indicating preferred embodiments and numerous
specific details thereof, are given by way of illustration and not of limitation. Many
changes and modifications may be made within the scope of the embodiments
herein without departing from the spirit thereof, and the embodiments herein
include all such modifications.
BRIEF DESCRIPTION OF FIGURES
[0015] The present disclosure is illustrated in the accompanying drawings,
throughout which like reference letters indicate corresponding parts in the various
figures. The embodiments herein will be better understood from the following
description with reference to the drawings, in which:
[0016] FIG. 1 illustrates a block diagram of a trainable apparatus.
[0017] FIG. 1A illustrates a system to train the trainable apparatus by
communicating with a learning unit.
[0018] FIG. 2 illustrates an isometric view of an example of the trainable
apparatus.
[0019] FIG. 3 is a flow chart illustrating a method to provide the trainable
apparatus.
DETAILED DESCRIPTION OF INVENTION
[0020] In the following detailed description of embodiments of the
invention, numerous specific details are set forth in order to provide a thorough
understanding of the embodiments of the invention. However, it will be obvious to a
person skilled in the art that the embodiments of the invention may be practiced
with or without these specific details. In other instances, well known methods,
procedures and components have not been described in detail so as not to
unnecessarily obscure aspects of the embodiments of the invention.
[0021] Furthermore, it will be clear that the invention is not limited to these
embodiments only. Numerous modifications, changes, variations, substitutions and
equivalents will be apparent to those skilled in the art without departing from the
scope of the invention.
[0022] Conditional language used herein, such as, among others, "can,"
"may," "might," “e.g.,” and the like, unless specifically stated otherwise, or
otherwise understood within the context as used, is generally intended to convey
that certain embodiments include, while other embodiments do not include, certain
features, elements and/or steps. Thus, such conditional language is not generally
intended to imply that features, elements and/or steps are in any way required for
one or more embodiments or that one or more embodiments necessarily include
logic for deciding, with or without other input or prompting, whether these features,
elements and/or steps are included or are to be performed in any particular
embodiment. The terms “comprising,” “including,” “having,” and the like are
synonymous and are used inclusively, in an open-ended fashion, and do not exclude
additional elements, features, acts, operations, and so forth. Also, the term “or” is
used in its inclusive sense (and not in its exclusive sense) so that when used, for
example, to connect a list of elements, the term “or” means one, some, or all of the
elements in the list.
[0023] Disjunctive language such as the phrase “at least one of X, Y, Z,”
unless specifically stated otherwise, is otherwise understood within the context as
used in general to present that an item, term, etc., may be either X, Y, or Z, or any
combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not
generally intended to, and should not, imply that certain embodiments require at
least one of X, at least one of Y, or at least one of Z to each be present.
[0024] The accompanying drawings are used to help easily understand
various technical features and it should be understood that the embodiments
presented herein are not limited by the accompanying drawings. As such, the
present disclosure should be construed to extend to any alterations, equivalents and
substitutes in addition to those which are particularly set out in the accompanying
drawings. Although the terms first, second, etc. may be used herein to describe
various elements, these elements should not be limited by these terms. These terms
are generally only used to distinguish one element from another.
[0025] The present invention enables users to build electronics and Internet of
Things applications, create their own machine learning models, and integrate
machine learning models, electronics and the Internet of Things together to make
the electronics intelligent with ease. With the help of the present invention, users
can also create or fine-tune a machine learning model and train/retrain the machine
learning model with their own data, which can be in the form of images, text,
numbers and combinations of data types. Further, the user can integrate this model
with the electronics/IoT setup using the developed pluggable electronic modules
(with plug and receptacle connectors), with ease of creating electronics and IoT
applications and integration with software for remote access and control.
[0026] Accordingly, disclosed herein are an electronic machine building block
system/educational apparatus, a software block system and a few of its use cases,
which aim to teach the basics of electronics and IoT, and to make creations smart
with machine learning, without requiring expertise in either. The system is modular
and trainable and consists of printed circuit boards (PCBs) interconnected with
connectors. Each block performs one or more discrete functions (e.g. Power
provides the required power for the system, the Proximity sensor reads the
proximity of an obstacle in front of it/gives a trigger of an object, etc.) and
communicates with the central block (BRAIN), and the blocks can be connected in
plurality to create a larger system. Some blocks provide sensor inputs; some blocks
provide output in the form of light, rotation, sound, etc. All these blocks are
electrically connected to the central block (BRAIN). The central block is a
communication block communicating with the software terminal. When the
software terminal is connected to the communication block, the system is in an
online state; if it is disconnected, the system is in an offline state. Further, a power
block, input blocks, output blocks and communication blocks may or may not have
micro-controller units as per the requirements of the modules. Communication
between the input, output and communication modules is in digital form. The
power, input and output blocks have male and female connectors which are to be
connected to the system to implement an electrical connection with the system. The
BRAIN has 4 female connectors labelled for different blocks to be connected into
them via male connectors. The blocks can be electrically connected in series via
connectors and also with connector extension cables.
[0027] The present invention provides machine learning model creation,
training and deployment on an electronic setup. The users can train the model with
a small number of examples (10-20) on a simplified user interface. The users can
train the model with their own created examples in an interactive manner. For
example, for training images, the user can capture/upload images for the particular
class and train the model. For language based models, the user can give voice based
inputs/type text to give training examples. In the case of multiple-feature models,
the user can input their data in the form of training examples by
voice/typing/capture and also upload a dataset for model training. The specified
ML model can be deployed on the electronics/IoT setup by joining the decision
matrix.
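The few-example training flow described above can be sketched as a minimal nearest-profile text classifier in plain Python. The "wet"/"dry" classes (for a smart-dustbin style experiment) and the word-overlap scoring are illustrative assumptions, not details from the specification.

```python
from collections import Counter

def train(examples):
    """Build one word-frequency profile per class from a handful of
    labelled (example, label) pairs."""
    profiles = {}
    for text, label in examples:
        profiles.setdefault(label, Counter()).update(text.lower().split())
    return profiles

def predict(profiles, text):
    """Return the class whose profile shares the most words with the input."""
    words = set(text.lower().split())
    return max(profiles, key=lambda label: sum(profiles[label][w] for w in words))

# roughly 10 toy examples, in the 10-20 range the interface expects (hypothetical data)
examples = [
    ("banana peel", "wet"), ("apple core", "wet"), ("leftover rice", "wet"),
    ("tea leaves", "wet"), ("vegetable peel", "wet"),
    ("plastic bottle", "dry"), ("old newspaper", "dry"), ("cardboard box", "dry"),
    ("metal can", "dry"), ("glass jar", "dry"),
]
model = train(examples)
print(predict(model, "banana peel on the floor"))  # → wet
```

A real deployment would use a trained model from the learning unit; this sketch only illustrates the train-with-few-examples-then-predict loop.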
[0028] Additionally, the present invention provides a learning-through-play
concept in the machine learning domain. The user gets a gamified interface that
increases user engagement and learning. The experiments are designed in a
storytelling based structure, where the user understands the problem and starts a
journey to solve the problem with the Internet of Things and machine learning.
[0029] Advantageously, the users can create custom machine learning
models without having any technical knowledge.
[0030] Referring now to the drawings, and more particularly to FIGS. 1
through 3, there are shown preferred embodiments.
[0031] FIG. 1 illustrates a block diagram of a trainable apparatus (100).
FIG. 1A illustrates a system to train the trainable apparatus by communicating with
a learning unit. FIG. 2 illustrates an isometric view of an example of the trainable
apparatus (100). The trainable apparatus (100) comprises an input/output module or
a first module (200), a central module (300), a matrix module or a second module
(400), a proxy module or a third module (500), a power module or a fourth module
(600) and a communicator (700). The input/output module or the first module
(200), the matrix module or the second module (400), the proxy module or the third
module (500), the power module or the fourth module (600) may be termed as a
plurality of peripherals.
[0032] The central module (300) is a main controller (micro-controller) or
BRAIN that manages all the connected peripherals like the input/output module
(200), the matrix module (400), the proxy module (500) and the power module
(600). The central module (300) may be communicatively coupled with the learning
unit (800) available at another device or a remote device (900) as shown in the
system (100a) in FIG. 1A. The remote device (900) may be a mobile device, a
smartphone, a PDA, a laptop, a personal computer or the like, and the learning unit
may be a platform or an application. The central module (300) may be enabled with
wired or wireless communication techniques such as WI-FI, LI-FI, Bluetooth, Near
field communication, LAN or the like. In an implementation, the central module
(300) is enabled with wireless communication. The central module (300) may be
connected with the learning unit available at the remote device via the
communicator (700) enabled with the above-mentioned communication techniques.
Alternatively, the central module (300) may have an in-built learning unit.
[0033] The learning unit (800) is accessible by a user. The learning unit is
configured to implement a machine learning methodology at the user’s end. The
machine learning method uses networks capable of learning in an unsupervised
fashion or in a supervised fashion from data that is unstructured (or unlabelled) or
labelled, respectively. The machine learning employs multiple layers of neural
networks that enable the platform of the present invention to teach itself through
inference and pattern recognition, rather than development of procedural code or
explicitly coded software algorithms. The neural networks are modelled according
to the neuronal structure of a mammal's cerebral cortex, wherein neurons are
represented as nodes and synapses are represented as uniquely weighted paths or
"tolled roads" between the nodes. The nodes are then organized into layers to
comprise a network. The neural networks are organized in a layered fashion that
includes an input layer, intermediate or hidden layers, and an output layer. The
neural networks enhance their learning capability by varying the uniquely weighted
paths based on their received input. The successive layers within the neural network
incorporate a learning capability by modifying their weighted coefficients based
on their received input patterns. The training of the neural networks is very similar
to how we teach children to recognize an object. The neural network is repetitively
trained from a base data set, where results from the output layer are successively
compared to the correct classification of the image. Similarly, in the present
invention, a training data set is developed from the user-provided real-time inputs.
In other words, the central module (300) is trained at the user’s end to make the
trainable apparatus (100) customizable based on the type of training provided.
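The layered, weighted-path structure described above can be illustrated with a minimal two-layer network in plain Python. The layer sizes, weight values and sigmoid activation are arbitrary illustrative choices, not parameters from the specification.

```python
import math

def dense(inputs, weights, biases):
    """One fully connected layer: each output node sums its uniquely
    weighted input paths, adds a bias, and applies a sigmoid activation."""
    outputs = []
    for node_weights, bias in zip(weights, biases):
        total = sum(w * x for w, x in zip(node_weights, inputs)) + bias
        outputs.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid squashes to (0, 1)
    return outputs

# input layer (2 values) -> hidden layer (3 nodes) -> output layer (1 node)
x = [0.5, -1.0]
hidden = dense(x, weights=[[0.1, 0.4], [-0.3, 0.2], [0.7, -0.6]],
               biases=[0.0, 0.1, -0.2])
out = dense(hidden, weights=[[0.5, -0.4, 0.9]], biases=[0.05])
print(round(out[0], 3))
```

Training would adjust the weight values to reduce the error between `out` and the correct label, which is the "varying the uniquely weighted paths" step described above.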
[0034] Alternatively, any machine learning paradigm instead of neural
networks can be used in the training and learning process of the trainable apparatus
(100).
[0035] The learning unit may be trained with pairs of labels (y) and
examples (x), known as features (f(x, y)) of the model.
[0036] The learning unit is an interactive, intuitive and learning based
platform that includes a story-telling structure to describe the problem's
environment and the requirements of AI/ML and IoT as a solution. The learning
unit may be a gamified interface that acts as a bridge between the user and the
trainable apparatus (100), which offers methods by which the user can build, deploy
and use machine learning models even without having prior knowledge of
AI/ML/IoT, with the help of story based interactions.
[0037] In an illustration, the user creates a real-time scenario. For example,
the user wishes to create a smart dustbin. In order to achieve this, the user may train
the learning unit first by using some examples as mentioned above, and based on the
training, the central module (300) acts. The user (or end-user) can train the trainable
apparatus (100) (i.e. the central module (300)) with a very small number of
examples (10-20) on a simplified user interface of the learning unit. The user can
train the learning unit with their own created examples in an interactive manner.
For example, for training images, the user can capture/upload images for the
particular class and train the model. For language based models, the user can give
voice based inputs/type text to give training examples. In the case of multiple-feature
models, the user can input their data in the form of training examples by
voice/typing/capture and also upload a dataset for model training. The specified
machine learning model can be deployed on the electronics/IoT setup by joining a
decision matrix. Typically, a decision matrix is a list of values in rows and columns
that allows a user to systematically identify, analyze and rate the performance of
relationships between sets of values and information. The elements of the decision
matrix show decisions based on certain decision criteria.
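The decision matrix described above can be sketched as a simple lookup from a predicted class (row) to a per-module action (column). The class names, module names and actions below are hypothetical, chosen to match the smart-dustbin illustration.

```python
# Rows are predicted classes; columns are actions for each output module.
# All names here are illustrative, not from the specification.
decision_matrix = {
    "wet":     {"matrix": "W", "servo": "open_left"},
    "dry":     {"matrix": "D", "servo": "open_right"},
    "unknown": {"matrix": "?", "servo": "stay"},
}

def decide(predicted_class):
    """Look up the row for the predicted class, falling back to 'unknown'."""
    return decision_matrix.get(predicted_class, decision_matrix["unknown"])

print(decide("wet")["servo"])  # → open_left
```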
[0038] The working of the learning unit is explained in more detail using an
example scenario. In an illustration, consider every real world problem (or event or
situation) as an Experiment. This experiment is introduced to the user with a
storyline, and the user is expected to find a solution. The learning unit may utilize
natural language processing, image based classification and mixed data type
machine learning models. The user inputs the data which is required to train the
machine learning model, and thus the learning unit. After completion of the
training, metrics of the performance of the created and trained machine learning
model are available for the user to assess/retrain, or to test and integrate with the
solution of the problem domain. Testing and integration of the created machine
learning model is the real case based testing and integration with the solution. The
user can assess the accuracy by testing it with real case scenario inputs.
[0039] For learning and training purposes, the user is given a real world
problem (EXPERIMENT) for which the user has to find a solution with the
specified technology in the problem definition. In other words, every real world
problem is termed an Experiment. This experiment is introduced to the user with
a storyline, and the user is expected to find a solution. Based on a relevant solution,
the user may do any of the following:
[0040] Image data training - The user captures and uploads images for the
corresponding class, satisfies the minimum number of examples required and trains
the model. The user can also upload corresponding images from an online image
search and also upload larger datasets provided by us on the cloud server.
[0041] Text data training - The user gives voice inputs/types text to create
new examples corresponding to the class, satisfies the minimum number of
examples required and trains the model. The user can also upload larger datasets
allowed by us on the cloud server.
[0042] Custom data type training - The user defines the custom data type
(combination of images, numbers, text etc.) for the ML model, trains the model by
inputting examples according to the data type, satisfies the minimum number of
examples required and trains the model. The user can also upload larger datasets
allowed by us on the cloud server.
[0043] The user, after training all the classes, tests the ML model for
accuracy. The user inputs a new example to be tested, and the model returns the
predicted class and the model's confidence in predicting the particular class. The
user can re-train the model based on their satisfaction with the accuracy levels.
After completion, the user is given gamification (experience) points based on their
performance.
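The test step above, returning a predicted class together with a confidence, can be sketched as a softmax over per-class scores. The raw score values are hypothetical; a real model would produce them internally.

```python
import math

def predict_with_confidence(scores):
    """Given raw per-class scores from a trained model, return the
    predicted class and a softmax confidence for that prediction."""
    exps = {label: math.exp(s) for label, s in scores.items()}
    total = sum(exps.values())
    best = max(exps, key=exps.get)
    return best, exps[best] / total

# hypothetical raw scores a trained model might output for one test input
label, conf = predict_with_confidence({"wet": 2.0, "dry": 0.5, "unknown": -1.0})
print(label, round(conf, 2))  # → wet 0.79
```

A low confidence on real test inputs is the signal, described above, that the user should re-train the model with more examples.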
[0044] Further, the user integrates the output of the learning unit with tasks
of the electronic/IoT system by joining the decision matrix. Any data that has to be
taken from the electronic setup in real time is pipelined here and sampled as per the
requirements of the model. After deployment of the learning unit, the user places the
trainable apparatus (100) and interacts with the experimental setup. The user can
ask peers to interact with the trainable apparatus (100) and play with their creation.
After a defined number of tests, the experimental setup is tested with a predefined
dataset; the performance of the experimental setup is calculated based on the
accuracy on the dataset, which decides the success/failure of the experiment. In
case of failure, the user has to re-train the model with more examples, removing
any issues. After completion, the user is given gamification (experience) points
based on their performance.
[0045] In an implementation, based on the performance of the user in the
experiment and the machine learning model, points are given to the user in their
profile created in the learning unit. Based on the experience points earned by the
user, a leader-board on the interface shows the standings of different users. The
leader-board has multiple standings based on overall performance and experiment
based performance. A user can challenge another user to compete with their trained
machine learning models and gain more experience points.
[0046] In this way, the central module (300) has the capability of the Internet
of Things (IoT) as well as machine learning. Once the learning unit (800) is trained
with specific examples, the central module (300) may implement the task in the real
world and may interact with the input/output module (200), the matrix module
(400), the proxy module (500) and the power module (600) over one or more
interfaces such as a serial peripheral interface (SPI), I2C or universal serial bus
interfaces to perform the task that has been learnt by the learning unit. The one or
more interfaces are already known to a person skilled in the art and are explained
below with respect to each of the input/output module (200), the matrix module
(400), the proxy module (500) and the power module (600).
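The routing of learned instructions to peripherals over their interfaces can be sketched as below. The peripheral names, payload bytes and the `act_on` mapping are illustrative assumptions; a real device would write to SPI/I2C buses via driver libraries rather than the in-memory log used here.

```python
class CentralModule:
    """Sketch of the BRAIN routing a learning-unit instruction to the
    right peripheral over the right interface (simulated, not real I/O)."""
    def __init__(self):
        self.log = []  # stand-in for actual bus writes

    def send(self, peripheral, interface, payload):
        # record (interface, peripheral, payload) in place of a bus transaction
        self.log.append((interface, peripheral, payload))

    def act_on(self, instruction):
        """Map a learning-unit instruction to a peripheral command."""
        if instruction == "show_alert":
            self.send("matrix", "SPI", b"!")    # matrix module uses SPI
        elif instruction == "read_proximity":
            self.send("proxy", "I2C", b"\x01")  # proxy module uses I2C

brain = CentralModule()
brain.act_on("show_alert")
print(brain.log)  # → [('SPI', 'matrix', b'!')]
```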
[0047] The central module (300) may be programmable with an external
USB/UART programmer with specific connections and may be configured with
firmware.
[0048] The central module (300), the input/output module (200), the matrix
module (400), the proxy module (500) and the power module (600) may act as plug
and play modules and may be various types of microcontrollers. Further, the central
module (300) may be communicatively coupled with the plurality of peripherals,
i.e., the input/output module (200), the matrix module (400), the proxy module
(500) and the power module (600), via a coupling or attachment means or physical
communication interfaces such as a plurality of connectors (202). The central
module (300) may have a plurality of first type of connectors (202) and each of the
input/output module (200), the matrix module (400), the proxy module (500) and
the power module (600) may have a plurality of second type of connectors (202) in
order to connect to the central module (300). In an example, the plurality of first
type of connectors may be female connectors and the plurality of second type of
connectors may be male connectors. Alternatively, the central module (300) may
have the plurality of second type of connectors and each of the input/output module
(200), the matrix module (400), the proxy module (500) and the power module
(600) may have the plurality of first type of connectors in order to connect to the
central module (300). In an example, the plurality of second type of connectors may
be female connectors and the plurality of first type of connectors may be male
connectors. Alternatively, the central module (300) and each of the input/output
module (200), the matrix module (400), the proxy module (500) and the power
module (600) may have a combination of the plurality of first type of connectors
and the plurality of second type of connectors. The number of first type of
connectors and second type of connectors may be four, fewer than four, or more
than four.
[0049] The central module (300) communicates with the input/output (I/O)
module (200) via the one or more interfaces. In an example, the central module
(300) communicates with the input/output module (200) using a USB serial
interface. Alternatively, the I2C interface embedded in the central module (300)
may drive the input/output module (200). The input/output module (200) may
include a plurality of sockets (204) that allows connecting a plurality of
components that may represent some behaviour or action, such as, but not limited
to, actuators, a servo motor or a light emitting diode (LED) strip. The I2C interface
embedded in the central module (300) may drive the plurality of sockets (204). In
an example, the LED strip may be an arrangement of several LEDs (RGB - Red,
Green, Blue) connected in series with the ability to light up with variable colour
(combinations of RGB). Similarly, the servo motor may be a motor with high
torque and low RPM in a three pin combination wired for power, which connects to
the I/O module for various motorised actions.
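The servo-motor control mentioned above can be illustrated with the common hobby-servo pulse-width convention of roughly 1000 µs at 0° to 2000 µs at 180°. These timings are an assumption for illustration; the specification does not state them.

```python
def servo_pulse_us(angle_deg):
    """Convert a servo angle (0-180 degrees) to a pulse width in
    microseconds, assuming the common hobby-servo convention of
    1000 us at 0 degrees and 2000 us at 180 degrees (not from the spec)."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle out of range")
    return 1000 + round(angle_deg / 180 * 1000)

print(servo_pulse_us(0), servo_pulse_us(90), servo_pulse_us(180))
# → 1000 1500 2000
```

The I/O module's firmware would repeat such a pulse roughly every 20 ms to hold the servo at the requested angle.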
[0050] In short, the I/O module consists of either an input sensor device, an
output sensor device, or a combination of both, along with various supporting
electronic circuitry components.
[0051] Alternatively, the input/output module (200) may be configured to
have a user interface to communicate with, or to provide input to/receive output
from, the central module (300). The input/output module (200) may include a
camera, a microphone, a sensor, a speaker, a display or the like. The input/output
module (200) may capture or record input provided by the user in real-time. The
captured or recorded input is utilized for training the central module (300) at the
user’s end. That is, the user of the trainable apparatus (100) may train the trainable
apparatus (100) through the input/output module (200).
[0052] The matrix module (400) is configured to have an LED matrix and
may act as an output device that shows meaningful data in the form of LEDs
working as small pixels. In an example, the matrix module (400) may have an 8x8
matrix, i.e., 64 LEDs connected in a row and column orientation. Every LED in
the matrix may act as an individual pixel. The matrix module (400) may be driven
by a MAX7219 driver which receives the data (command) from the central module
(300) over the SPI interface. After decoding the data, the MAX7219 driver IC
shows the data/output by turning the corresponding LEDs on and off as required.
The matrix module (400) is capable of showing alphabets/numbers/graphical
forms as per the requirement and feasibility of the 8x8 matrix.
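The specification does not detail the exact SPI traffic, but the MAX7219 conventionally exposes digit registers 0x01 through 0x08, one per row of the 8x8 matrix, each holding one byte of pixel data. A minimal sketch, under that assumption, of encoding an 8x8 bitmap into the (register, data) pairs that would be clocked out over SPI:

```python
def max7219_frame(rows):
    """Encode an 8x8 bitmap as (register, data) byte pairs for a
    MAX7219 driver. Digit registers 0x01..0x08 each hold one row of
    eight pixels; each pair would be shifted out over SPI in turn."""
    if len(rows) != 8 or any(not 0 <= r <= 0xFF for r in rows):
        raise ValueError("expected eight 8-bit row values")
    # Row 0 maps to digit register 0x01, row 7 to 0x08.
    return [(digit + 1, row) for digit, row in enumerate(rows)]

# A hollow square pattern: top/bottom rows lit, side pixels in between.
square = [0xFF, 0x81, 0x81, 0x81, 0x81, 0x81, 0x81, 0xFF]
frame = max7219_frame(square)
```

Animated output such as the forward arrow in the example below would simply be a sequence of such frames written at a fixed interval.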
[0053] The proxy module (500) may be a sensor powered by optical sensing
technology, as small as 1 cm in size. In an example, the sensor may be a proximity
sensor. The proxy module (500) may have the capability to sense objects/events
occurring within a particular distance range in its surroundings. In an example, the
sensing distance may be between 10 and 15 cm. The proxy module (500) works on
the I2C interface and is configured by the I2C interface present in the central
module (300).
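A hedged sketch of how the central module (300) might poll such a sensor over I2C. The device address, register, and 8-bit raw-to-centimetre scaling are illustrative assumptions (a real part defines its own register map), and a simulated bus stands in for actual I2C hardware:

```python
class FakeBus:
    """Simulated I2C bus so the sketch runs without hardware; a real
    bus object (e.g. smbus2.SMBus) would take its place."""
    def __init__(self, raw):
        self.raw = raw

    def read_byte_data(self, address, register):
        return self.raw


class ProxyModule:
    """Minimal polling logic for the proxy module (500). The address
    and register below are hypothetical, chosen for illustration."""
    ADDRESS = 0x39       # assumed I2C address
    DISTANCE_REG = 0x00  # assumed distance register

    def __init__(self, bus, max_range_cm=15.0):
        self.bus = bus
        self.max_range_cm = max_range_cm

    def read_distance_cm(self):
        # Assume an 8-bit raw reading scaled linearly to the range.
        raw = self.bus.read_byte_data(self.ADDRESS, self.DISTANCE_REG)
        return raw * self.max_range_cm / 255

    def object_detected(self, threshold_cm=15.0):
        return self.read_distance_cm() <= threshold_cm
```

In the parking illustration later in the specification, a `True` result from `object_detected()` is what would trigger the camera capture.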
[0054] The power module (600) works as a switching module for providing
power to the central module (300). The power module (600) may receive power
through batteries. Alternatively, the power module (600) may utilize a micro USB
or other peripheral that can be connected to any power source such as a laptop's
USB port, a power bank's USB port, or even a regulated DC power source (e.g.,
LiPo batteries) or the like. The power module (600) regulates the power with a
constant current and voltage supply. In an implementation, the power module
(600) may have the capability to perform single-tap on and single-tap off. That is,
the power module (600) may have a switch or a button that may be pressed to turn
the central module (300) ON/OFF.
[0055] Each of the central module (300), the input/output module (200), the
matrix module (400), the proxy module (500) and the power module (600) may
have an octagonal (regular and/or irregular) outer shape, integrated with its
respective printed circuit board and plastic hand-holds. Other shapes are also
possible. Each of the central module (300), the input/output module (200), the
matrix module (400), the proxy module (500) and the power module (600) may
indicate whether it is active/inactive. Each of the central module (300), the
input/output module (200), the matrix module (400), the proxy module (500) and
the power module (600) is designed to have a physical communication interface to
interact with the others and may consist of slots for micro-controller(s) or
input/output sensors as explained above.
[0056] FIG. 3 is a flow chart (S300) illustrating a method to provide the
trainable apparatus (100). At step S302, the method includes creating machine
learning labels for a real-time situation or experiment in the learning unit. At step
S304, the method includes adding features to each label for training the machine
learning model in the learning unit. At step S306, the method includes training the
machine learning model in the learning unit. At step S308, the method includes
testing and saving the trained machine learning model, and at step S310, the
method includes communicating the trained machine learning model to the
trainable apparatus (100) so that the trainable apparatus (100) may perform the
learned task.
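Steps S302-S310 can be mirrored with a toy stand-in for the learning unit: labels are created, feature vectors are added per label, training computes a per-label centroid, and prediction returns the nearest label. This nearest-centroid classifier is only an illustration of the workflow, not the model the specification actually uses:

```python
class LearningUnit:
    """Toy stand-in for the learning unit (800), mirroring the flow
    chart: S302 create labels, S304 add features, S306 train,
    then predict so a result can be sent to the apparatus."""

    def __init__(self):
        self.examples = {}   # label -> list of feature vectors
        self.centroids = {}  # label -> trained mean vector

    def create_label(self, label):           # S302
        self.examples.setdefault(label, [])

    def add_feature(self, label, vector):    # S304
        self.examples[label].append(vector)

    def train(self):                         # S306: mean vector per label
        for label, vecs in self.examples.items():
            n = len(vecs)
            self.centroids[label] = [sum(c) / n for c in zip(*vecs)]

    def predict(self, vector):               # nearest centroid wins
        def sq_dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(vector, centroid))
        return min(self.centroids, key=lambda l: sq_dist(self.centroids[l]))
```

Steps S308 and S310 would correspond to checking held-out examples against `predict` and then serialising the centroids to the apparatus (or cloud).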
[0057] In an illustration, consider an intelligent parking system. The
intelligent parking system is a solution toward modernizing a parking system
entrance with the help of IoT, AI and ML. The parking system will be smart
enough to tell what kind of vehicle, such as a two-wheeler, a four-wheeler or the
like, someone wants to park. Based on the learning, the intelligent parking system
will direct the vehicles to their designated locations. That is, when a
vehicle/pedestrian comes in front of the proxy module and in front of the camera
(phone), the system will understand it (someone has come), bill the vehicle, open
the barricade (servo motor with the I/O module) and direct it to the right place
(showing an animated forward arrow on the matrix module). In order to achieve
the above objective, the learning unit is trained. The user may create the machine
learning model with three labels: Human, Car, Bike. The user may supply or
provide ten or more images for each label and train the model for achieving good
accuracy. Once trained, the trained machine learning model may predict the
vehicle type. This machine learning model may be stored on a cloud server or in a
memory. The trainable apparatus (100), specifically the central module (300),
receives the result/prediction from the created machine learning model and
instructs its peripherals accordingly. That is, whenever a car comes to an entrance,
the proxy module (500) senses the presence and takes the picture, and the matrix
module (400) shows an animated cross to instruct it to stop. The captured picture
is fed to the machine learning model for prediction. If it is a car, a corresponding
gate will open to allow passage to the car. Similarly, if a bike is predicted, a
corresponding gate will open to allow passage to the bike. Similarly, if a human is
predicted, a corresponding gate will open to allow passage to the human. The gate
may be opened with the help of the servo and a forward arrow on the matrix
module (400).
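The routing described in this illustration reduces to mapping the predicted label onto a gate action, with the animated cross retained for anything unrecognised. The gate names below are hypothetical, chosen only to make the sketch concrete:

```python
# Hypothetical mapping from predicted label to the gate the servo opens.
GATES = {"Car": "car_gate", "Bike": "bike_gate", "Human": "pedestrian_gate"}

def dispatch(prediction):
    """Turn the learning unit's prediction into an apparatus action:
    open the matching gate (servo via the I/O module, forward arrow
    on the matrix module), or keep showing the stop cross when the
    label is not recognised."""
    if prediction not in GATES:
        return ("matrix", "cross")   # unknown label: stay stopped
    return ("open", GATES[prediction])
```

In the apparatus, the central module (300) would call something like `dispatch` on each prediction received from the learning unit and drive its peripherals accordingly.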
[0058] Advantageously, the present invention provides a system and a
method to build electronics and Internet of Things devices, create one's own
machine learning models, and integrate the machine learning models, electronics
and Internet of Things together to make the system intelligent with ease. The
present invention introduces people to the Internet of Things and enables them to
make their creations intelligent with machine learning in a customized manner.
[0059] The various actions, acts, blocks, steps, or the like in the flow
diagram (S300) may be performed in the order presented, in a different order or
simultaneously. Further, in some embodiments, some of the actions, acts, blocks,
steps, or the like may be omitted, added, modified, skipped, or the like without
departing from the scope of the invention.
[0060] The embodiments disclosed herein can be implemented using at
least one software program running on at least one hardware device and performing
network management functions to control the elements.
[0061] The foregoing description of the specific embodiments will so fully
reveal the general nature of the embodiments herein that others can, by applying
current knowledge, readily modify and/or adapt for various applications such
specific embodiments without departing from the generic concept, and, therefore,
such adaptations and modifications should and are intended to be comprehended
within the meaning and range of equivalents of the disclosed embodiments. It is to
be understood that the phraseology or terminology employed herein is for the
purpose of description and not of limitation. Therefore, while the embodiments
herein have been described in terms of preferred embodiments, those skilled in the
art will recognize that the embodiments herein can be practiced with modification
within the spirit and scope of the embodiments as described herein.

We claim:

1. An Internet of Things (IoT) enabled trainable apparatus (100), comprising:
a central module (300) configured to have a plurality of first type of
connectors (202) to connect with a plurality of peripherals, wherein the
plurality of peripherals is configured to have a plurality of second type of
connectors (202) that allows connectivity with the central module (300); and
the plurality of peripherals comprising at least one of an input/output
module (200), a matrix module (400), a proxy module (500) and a power
module (600),
wherein the plurality of peripherals communicates with the central
module (300) through one or more interfaces.
2. The IoT enabled trainable apparatus (100) as claimed in claim 1, wherein the
plurality of first type of connectors (202) is a female connector and the plurality of
second type of connectors (202) is a male connector.
3. The IoT enabled trainable apparatus (100) as claimed in claim 1, wherein the
plurality of first type of connectors (202) is a male connector and the plurality of
second type of connectors (202) is a female connector.
4. The IoT enabled trainable apparatus (100) as claimed in claim 1, wherein
the input/output module (200) consists of a plurality of sockets to connect a
plurality of components that represents some behaviour or action.
5. The IoT enabled trainable apparatus (100) as claimed in claim 1, wherein
the matrix module (400) acts as an output device representing output via a
plurality of light emitting diodes arranged in a row and column orientation.
6. The IoT enabled trainable apparatus (100) as claimed in claim 1, wherein
the proxy module (500) is configured to sense objects and/or events in a particular
range in its surroundings and the power module (600) is configured as a switching
module to drive the central module (300).
7. The IoT enabled trainable apparatus (100) as claimed in claim 1 is
configured to communicate with a learning unit (800) via a communicator (700),
wherein the learning unit (800) is a machine learning model that is trained in
real-time with respect to real-world events and once the IoT enabled trainable
apparatus (100) is introduced to the same event for which the learning unit (800)
has been trained, the IoT enabled trainable apparatus (100) performs activities to
solve the real-world events by receiving instructions from the learning unit (800).
8. The IoT enabled trainable apparatus (100) as claimed in claim 7, wherein
the learning unit (800) is trained at a user’s end by providing 10-20 examples to
solve the real-world problems or events.
9. The IoT enabled trainable apparatus (100) as claimed in claim 7, wherein
the learning unit (800) is trained using image data, text data, or other custom data,
where the learning unit (800) performs categorization and labelling based on the
training data to solve the real-world problems or events.
10. An Internet of Things (IoT) enabled trainable system (100a), comprising:
a trainable apparatus (100), the trainable apparatus comprising:
a central module (300) configured to have a plurality of first
type of connectors (202) to connect with a plurality of peripherals,
wherein the plurality of peripherals is configured to have a plurality
of second type of connectors (202) that allows connectivity with the
central module (300); and
the plurality of peripherals comprising at least one of an
input/output module (200), a matrix module (400), a proxy module
(500) and a power module (600),
wherein the plurality of peripherals communicates with the
central module (300) through one or more interfaces,
a learning unit (800) communicatively coupled with the trainable
apparatus (100), wherein the learning unit (800) is a machine learning model
that is trained in real-time with respect to real-world events and once the
trainable apparatus (100) is introduced to the same event for which the learning
unit (800) has been trained, the trainable apparatus (100) performs activities
to solve the real-world event by receiving instructions from the learning unit
(800).

Documents

Application Documents

# Name Date
1 202011001935-CLAIMS [25-08-2022(online)].pdf 2022-08-25
2 202011001935-STATEMENT OF UNDERTAKING (FORM 3) [16-01-2020(online)].pdf 2020-01-16
3 202011001935-FER_SER_REPLY [25-08-2022(online)].pdf 2022-08-25
4 202011001935-PROVISIONAL SPECIFICATION [16-01-2020(online)].pdf 2020-01-16
5 202011001935-POWER OF AUTHORITY [16-01-2020(online)].pdf 2020-01-16
6 202011001935-FER.pdf 2022-05-06
7 202011001935-FORM FOR STARTUP [16-01-2020(online)].pdf 2020-01-16
8 202011001935-FORM 18 [02-11-2021(online)].pdf 2021-11-02
9 202011001935-FORM FOR SMALL ENTITY(FORM-28) [16-01-2020(online)].pdf 2020-01-16
10 202011001935-COMPLETE SPECIFICATION [23-12-2020(online)].pdf 2020-12-23
11 202011001935-FORM 1 [16-01-2020(online)].pdf 2020-01-16
12 202011001935-DRAWING [23-12-2020(online)].pdf 2020-12-23
13 202011001935-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [16-01-2020(online)].pdf 2020-01-16
14 202011001935-ENDORSEMENT BY INVENTORS [23-12-2020(online)].pdf 2020-12-23
15 202011001935-FORM 13 [23-12-2020(online)].pdf 2020-12-23
16 202011001935-EVIDENCE FOR REGISTRATION UNDER SSI [16-01-2020(online)].pdf 2020-01-16
17 202011001935-DRAWINGS [16-01-2020(online)].pdf 2020-01-16
18 202011001935-RELEVANT DOCUMENTS [23-12-2020(online)].pdf 2020-12-23
19 202011001935-OTHERS-050220.pdf 2020-02-06
20 abstract.jpg 2020-01-24
21 202011001935-Power of Attorney-050220.pdf 2020-02-06

Search Strategy

1 searchE_05-05-2022.pdf