
System For Controlling Operation Of Electronic Items Using Actuators

Abstract: According to an aspect, a method for executing configurable tasks includes: generating, by one or more sensors, an input signal pertaining to one or more attributes sensed by the one or more sensors; pre-processing, by one or more processors of a computing engine operatively coupled to the one or more sensors and one or more actuators, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one of one or more backgrounds by extracting one or more parameters from the processed signal, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and in response to the background classification of the extracted one or more parameters, generating, by the one or more processors, a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken, and wherein the selected set of actuators, on receipt of the generated control signal, execute the remedial action corresponding to one or more electronic items operatively coupled to the one or more actuators.


Patent Information

Filing Date: 29 October 2019
Publication Number: 50/2019
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Grant Date: 2020-09-07

Applicants

CYRAN AI SOLUTIONS PRIVATE LIMITED
B-40, Kailash Colony, New Delhi - 110048, India.

Inventors

1. SURI, Manan
B-40, Second Floor, Kailash Colony, New Delhi - 110048, India.

Specification

The present disclosure relates to an auto-configurable system and a method thereof.
More particularly, the present disclosure relates to a system and method for updating and executing configurable tasks.
BACKGROUND
[0002] The background description includes information that may be useful in
understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Artificial intelligence (AI) is the simulation of human intelligence processes
by machines, especially computer systems. These processes include learning (the acquisition of information and rules for using the information), reasoning (using rules to reach approximate or definite conclusions) and self-correction. Particular applications of AI include expert systems, speech recognition and machine vision. AI can be categorized as either weak or strong. Weak AI, also known as narrow AI, is an AI system that is designed and trained for a particular task. Virtual personal assistants, such as Apple's Siri, are a form of weak AI. Strong AI, also known as artificial general intelligence, is an AI system with generalized human cognitive abilities. When presented with an unfamiliar task, a strong AI system is able to find a solution without human intervention.
[0004] Because hardware, software and staffing costs for AI can be high, many vendors include AI components in their standard offerings, as well as access to Artificial Intelligence as a Service (AIaaS) platforms. AI as a Service allows individuals and companies to experiment with AI for various business purposes and sample multiple platforms before making a commitment. Popular AI cloud offerings include Amazon AI services, IBM Watson Assistant, Microsoft Cognitive Services and Google AI services.
[0005] While AI tools present a range of new functionality for businesses, the use of
artificial intelligence raises ethical questions. This is because deep learning algorithms, which underpin many of the most advanced AI tools, are only as smart as the data they are given in training. Because a human selects what data should be used for training an AI program, the potential for human bias is inherent and must be monitored closely.

[0006] Some industry experts believe that the term artificial intelligence is too closely
linked to popular culture, causing the general public to have unrealistic fears about artificial intelligence and improbable expectations about how it will change the workplace and life in general. Researchers and marketers hope the label augmented intelligence, which has a more neutral connotation, will help people understand that AI will simply improve products and services, not replace the humans that use them.
[0007] However, the full benefits of state-of-the-art technologies are far from the reach of the masses, owing to the fact that AI-based systems and devices generally require a professional to program and/or pre-program the database before the devices and the system can be used. This generally leads to time wasted in programming and testing for various applications. Existing technologies require the supervision of an expert. It is therefore very difficult for a layman to understand and carry out experiments, inventions and the like without expertise in the subject matter.
[0008] There is, therefore, a need in the art for systems and methods for updating and executing configurable tasks that can be used by non-professional users to perform inventions, experiments and/or feedback-based tasks and the like, and that are robust, accurate, fast, efficient, cost-effective and simple.
OBJECTS OF THE PRESENT DISCLOSURE
[0009] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are listed herein below.
[0010] It is an object of the present disclosure to provide a system and method for executing configurable tasks.
[0011] It is another object of the present disclosure to provide a system and method with a ready-to-use feature that enables a non-professional user to perform experimentation, invention and the like.
[0012] It is another object of the present disclosure to provide a system and method that enables training or updating of the database to enhance usability.
[0013] It is another object of the present disclosure to provide a system and method for executing configurable tasks that does not require exchanging data packets of information with a remote server, so that processing is faster and security is enhanced.

[0014] It is another object of the present disclosure to provide a system and method for executing configurable tasks that reduces or prevents cost incurred due to network data expenditure.
[0015] It is yet another object of the present disclosure to provide a system and method for executing configurable tasks that is cost-effective and easy to implement.
SUMMARY
[0016] The present disclosure relates to an auto-configurable system and a method thereof. More particularly, the present disclosure relates to a system and method for updating and executing configurable tasks.
[0017] An aspect of the present disclosure provides a method comprising the steps of:
generating, by one or more sensors, an input signal pertaining to one or more attributes sensed by the one or more sensors; pre-processing, by one or more processors of a computing engine operatively coupled to the one or more sensors and one or more actuators, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one of one or more backgrounds by extracting one or more parameters from the processed signal, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and in response to the background classification of the extracted one or more parameters, generating, by the one or more processors, a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken, and wherein the selected set of actuators, on receipt of the generated control signal, execute the remedial action corresponding to one or more electronic items operatively coupled to the one or more actuators.
[0018] In an aspect, the method comprises updating the dataset, said updating the
dataset comprises the steps of: generating, by the one or more sensors, the input signal pertaining to the one or more attributes sensed by the one or more sensors; pre-processing, by the one or more processors, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one new background by extracting information from a set of data packets received from a computing device associated with a user; and in response to the new background classification, generating, by the one or more processors, a new remedial action based on the processed signal, where the new remedial action comprises information pertaining to selecting of a first
set of actuators of the one or more actuators, and controlling operation of corresponding one or more electronic items using the first set of actuators.
[0019] In an aspect, the pre-processing step comprises any or a combination of
enhancement, resizing, correction and reduction of data information of the input signal.
[0020] In an aspect, each of the one or more sensors comprises any or a combination
of an optical sensor, a microphone, a spectral sensor, a piezoelectric sensor, a temperature sensor, a thermal imaging sensor, an infrared sensor, a near infrared sensor, a pressure sensor and a proximity sensor.
[0021] In an aspect, the one or more actuators comprises a relay, a switch, an
electrical actuator, a mechanical actuator and an electromechanical actuator.
[0022] In an aspect, the remedial action comprises controlling operation of the one
or more electronic items using the one or more actuators.
[0023] In an aspect, the one or more electronic items comprises a Liquid Crystal
Display, a Light Emitting Diode, a motor, a buzzer, an audio unit and a video unit.
[0024] In an aspect, the method comprises estimating accuracy in executing the
remedial action.
[0025] Yet another aspect of the present disclosure provides a system comprising: one
or more actuators configured with one or more electronic items; one or more sensors configured to generate an input signal pertaining to one or more attributes sensed by the one or more sensors; and a computing engine operatively coupled with the one or more sensors and the one or more actuators, the computing engine comprising one or more processors coupled to a memory, the memory storing instructions executable by the one or more processors to: pre-process the generated input signal to generate a processed signal; classify the processed signal into at least one of one or more backgrounds by extracting one or more parameters, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and in response to the background classification of the extracted one or more parameters, generate a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken; and wherein the selected set of actuators, on receipt of the generated control signal, execute the remedial action corresponding to the one or more electronic items.
[0026] In an aspect, the system comprises an input unit operatively coupled to the computing engine to enable interaction of a user with the computing engine, wherein the input unit comprises a keyboard, a mouse, a stylus, an audio input unit, a visual input unit and a touch enabled display.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] In the figures, similar components and/or features may have the same
reference label. Further, various components of the same type may be distinguished by
following the reference label with a second label that distinguishes among the similar
components. If only the first reference label is used in the specification, the description is
applicable to any one of the similar components having the same first reference label
irrespective of the second reference label.
[0028] FIG. 1 illustrates an exemplary network architecture in which or with which
proposed system can be implemented in accordance with an embodiment of the present
disclosure.
[0029] FIG. 2 illustrates an exemplary functional diagram for executing pre-
configurable tasks in accordance with an embodiment of the present disclosure.
[0030] FIG. 3A is a flow diagram illustrating a process for executing pre-configurable
tasks in accordance with an embodiment of the present disclosure.
[0031] FIG. 3B is a flow diagram illustrating a process for training a database in
accordance with an embodiment of the present disclosure.
[0032] FIG. 4 illustrates an exemplary graphical user interface in accordance with an
embodiment of the present disclosure.
[0033] FIG. 5 illustrates an exemplary process of inference mode of operation in
accordance with an embodiment of the present disclosure.
[0034] FIG. 6 illustrates an exemplary process of training mode of operation in
accordance with an embodiment of the present disclosure.
[0035] FIG. 7 illustrates an exemplary training mode in accordance with an
embodiment of the present disclosure.
[0036] FIG. 8 illustrates an exemplary execution of remedial action using an
exemplary actuation pipeline in accordance with an embodiment of the present disclosure.
[0037] FIG. 9 illustrates an exemplary implementation using a camera and a toy car in
accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION
[0038] The following is a detailed description of embodiments of the disclosure
depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0039] In the following description, numerous specific details are set forth in order to
provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practised without some of these specific details.
[0040] Embodiments of the present invention include various steps, which will be
described below. The steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and firmware and/or by human operators.
[0041] Various methods described herein may be practiced by combining one or more
machine-readable storage media containing the code according to the present invention with
appropriate standard computer hardware to execute the code contained therein. An apparatus
for practicing various embodiments of the present invention may involve one or more
computers (or one or more processors within a single computer) and storage systems
containing or having network access to computer program(s) coded in accordance with
various methods described herein, and the method steps of the invention could be
accomplished by modules, routines, subroutines, or subparts of a computer program product.
[0042] If the specification states a component or feature "may", "can", "could", or
"might" be included or have a characteristic, that particular component or feature is not required to be included or have the characteristic.
[0043] As used in the description herein and throughout the claims that follow, the
meaning of "a," "an," and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.
[0044] Exemplary embodiments will now be described more fully hereinafter with
reference to the accompanying drawings, in which exemplary embodiments are shown. These
exemplary embodiments are provided only for illustrative purposes and so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art. The invention disclosed may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Various modifications will be readily apparent to persons skilled in the art. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Moreover, all statements herein reciting embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention.
[0045] Thus, for example, it will be appreciated by those of ordinary skill in the art
that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named element.
[0046] Furthermore, embodiments may be implemented by hardware, software,
firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks (e.g., a computer-program product)
may be stored in a machine-readable medium. A processor(s) may perform the necessary
tasks.
[0047] All methods described herein may be performed in any suitable order unless
otherwise indicated herein or otherwise clearly contradicted by context. The use of any and
all examples, or exemplary language (e.g., "such as") provided with respect to certain
embodiments herein is intended merely to better illuminate the invention and does not pose a
limitation on the scope of the invention otherwise claimed. No language in the specification
should be construed as indicating any non-claimed element essential to the practice of the
invention.
[0048] Various terms as used herein are shown below. To the extent a term used in a
claim is not defined below, it should be given the broadest definition persons in the pertinent
art have given that term as reflected in printed publications and issued patents at the time of
filing.
[0049] The present disclosure relates to an auto-configurable system and a method thereof. More particularly, the present disclosure relates to a system and method for updating and executing configurable tasks.
[0050] An aspect of the present disclosure provides a method comprising the steps of:
generating, by one or more sensors, an input signal pertaining to one or more attributes
sensed by the one or more sensors; pre-processing, by one or more processors of a computing
engine operatively coupled to the one or more sensors and one or more actuators, the
generated input signal to generate a processed signal; classifying, by the one or more
processors, the processed signal into at least one of one or more backgrounds by extracting
one or more parameters from the processed signal, and comparing the extracted one or more
parameters with a dataset comprising a set of prestored parameters associated with
corresponding background; and in response to the background classification of the extracted
one or more parameters, generating, by the one or more processors, a control signal, where
the control signal comprises information pertaining to selecting of a set of actuators of the
one or more actuators and a remedial action to be taken, and wherein the selected set of
actuators, on receipt of the generated control signal, execute the remedial action
corresponding to one or more electronic items operatively coupled to the one or more
actuators.
[0051] In an aspect, the method comprises updating the dataset, said updating the
dataset comprises the steps of: generating, by the one or more sensors, the input signal
pertaining to the one or more attributes sensed by the one or more sensors; pre-processing, by
the one or more processors, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one new background by extracting information from a set of data packets received from a computing device associated with a user; and in response to the new background classification, generating, by the one or more processors, a new remedial action based on the processed signal, where the new remedial action comprises information pertaining to selecting of a first set of actuators of the one or more actuators, and controlling operation of corresponding one or more electronic items using the first set of actuators.
[0052] In an aspect, the pre-processing step comprises any or a combination of
enhancement, resizing, correction and reduction of data information of the input signal.
[0053] In an aspect, each of the one or more sensors comprises any or a combination
of an optical sensor, a microphone, a spectral sensor, a piezoelectric sensor, a temperature sensor, a thermal imaging sensor, an infrared sensor, a near infrared sensor, a pressure sensor and a proximity sensor.
[0054] In an aspect, the one or more actuators comprises a relay, a switch, an
electrical actuator, a mechanical actuator and an electromechanical actuator.
[0055] In an aspect, the remedial action comprises controlling operation of the one or
more electronic items using the one or more actuators.
[0056] In an aspect, the one or more electronic items comprises a Liquid Crystal
Display, a Light Emitting Diode, a motor, a buzzer, an audio unit and a video unit.
[0057] In an aspect, the method comprises estimating accuracy in executing the
remedial action.
[0058] Yet another aspect of the present disclosure provides a system comprising: one or more actuators configured with one or more electronic items; one or more sensors configured to generate an input signal pertaining to one or more attributes sensed by the one or more sensors; and a computing engine operatively coupled with the one or more sensors and the one or more actuators, the computing engine comprising one or more processors coupled to a memory, the memory storing instructions executable by the one or more processors to: pre-process the generated input signal to generate a processed signal; classify the processed signal into at least one of one or more backgrounds by extracting one or more parameters, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and in response to the background classification of the extracted one or more parameters, generate a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken; and wherein the selected set of actuators, on receipt of the generated control signal, execute the remedial action corresponding to the one or more electronic items.
[0059] In an aspect, the system comprises an input unit operatively coupled to the
computing engine to enable interaction of a user with the computing engine, the input unit comprises a keyboard, a mouse, a stylus, an audio input unit, a visual input unit and a touch enabled display.
[0060] FIG. 1 illustrates an exemplary network architecture in which or with which
proposed system can be implemented in accordance with an embodiment of the present disclosure.
[0061] According to an embodiment, the present disclosure provides a system 100 for executing configurable tasks. The system 100 includes a computing engine 102. In the present embodiment, the computing engine 102 can be proprietary electronic hardware where the processing and implementation of various sets of instructions stored in memory take place. In an embodiment, the computing engine 102 can be the brain equivalent of the system 100. The computing engine 102 may include, but is in no way limited to, microcontroller(s), microprocessor(s), semiconductor on-chip/off-chip memory, and input/output interfaces such as universal serial bus (USB), inter-integrated circuit (I2C), high definition multimedia interface (HDMI), serial peripheral interface (SPI), audio jack etc., for interfacing with external components. The computing engine 102 may also contain Application-Specific Integrated Circuits (ASICs) specialized for the purpose of executing sets of instructions that may include Artificial Intelligence (AI) algorithms and the like. Some of the ASICs may include graphics processing units (GPUs), tensor processing units (TPUs), Field-Programmable Gate Arrays (FPGAs), AI-inference and AI-training accelerators etc. The computing engine 102 may also contain memory elements such as static random-access memory (SRAM), dynamic random-access memory (DRAM), Flash memory, a secure digital (SD) card, a micro-SD card, a Hard Disk drive etc.
[0062] In an embodiment, the system 100 includes one or more sensors 104-1, 104-2, ..., 104-N (collectively referred to as sensors 104 herein, and individually referred to as sensor 104 herein) operatively coupled to the computing engine 102. The sensors 104 may include cameras, microphones, optical/spectral sensors, piezoelectric sensors, temperature/thermal/IR/NIR sensors, pressure sensors etc. The sensors 104 can be used for sensing various attributes specific to the type of sensor 104 selected, to generate an input signal indicative of the various attributes sensed by the sensors 104. In one embodiment, the sensed attributes can be directly transmitted to the computing engine 102 for processing. In another embodiment, the sensed attributes can be transmitted to the computing engine 102 after various pre-processing of the sensed attributes. In yet another embodiment, the pre-processing can include enhancement or filtering steps like noise reduction, image enhancement etc.
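By way of a non-limiting illustration only (not part of the original disclosure), the following Python sketch shows one plausible realization of this sensing and pre-processing stage using OpenCV; the camera index, denoising parameters and target size are assumptions of the sketch.

```python
# Illustrative sketch: grab one frame from a camera sensor (104) and apply
# noise reduction and resizing before handing it to the computing engine.
import cv2  # pip install opencv-python

def capture_frame(device_index=0):
    """Generate an input signal: read a single frame from the camera."""
    cap = cv2.VideoCapture(device_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("sensor produced no input signal")
    return frame

def preprocess(frame, size=(224, 224)):
    """Pre-process the input signal: denoise, then resize."""
    denoised = cv2.fastNlMeansDenoisingColored(frame, None, 10, 10, 7, 21)
    return cv2.resize(denoised, size)
```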
[0063] In an embodiment, the system 100 further includes a Printed Circuit Board (PCB) 108 (interchangeably referred to as interfacing unit 108 herein), a specially designed proprietary and custom electronic PCB that facilitates interfacing between the computing engine 102, the sensors 104, the external I/O peripherals 106 (interchangeably referred to as set of electronic devices 106 herein), and the external electro/mechanical actuators 110 (interchangeably referred to as set of actuators 110 herein).
[0064] In an embodiment, the system further includes actuators/peripherals 108, 110 operatively coupled to the computing engine 102. The actuators/peripherals 108, 110 may include devices like a keyboard, mouse, stylus, display, LCDs, LEDs, motors, buzzers, relays, switches or any other electro-mechanical actuator that can be controlled using interfaces such as USB, I2C, SPI etc.
[0065] In an embodiment, the system 100 can provide a graphical user interface
(GUI) for facilitating the interaction of the user with the system. The GUI can be a custom and proprietary software developed to help the user with the visualization, control, interfacing, execution, utility of the entire system in a very easy and user-friendly manner without the need for programming. The GUI also acts as the front end of the AI Inference and AI training algorithms.
[0066] In an embodiment, the system 100 can be configured to facilitate execution of
the steps of: generating, by one or more sensors, an input signal pertaining to one or more attributes sensed by the one or more sensors; pre-processing, by one or more processors of a computing engine operatively coupled to the one or more sensors and one or more actuators, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one of one or more backgrounds by extracting one or more parameters from the processed signal, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and in response to the background classification of the extracted one or more parameters, generating, by the one or more processors, a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken, and wherein the selected set of
actuators, on receipt of the generated control signal, execute the remedial action corresponding to one or more electronic items operatively coupled to the one or more actuators.
[0067] In an embodiment, the system 100 can facilitate updating or training the
dataset which includes the steps of generating, by the one or more sensors, the input signal pertaining to the one or more attributes sensed by the one or more sensors; pre-processing, by the one or more processors, the generated input signal to generate a processed signal; classifying, by the one or more processors, the processed signal into at least one new background by extracting information from a set of data packets received from a computing device associated with a user; and in response to the new background classification, generating, by the one or more processors, a new remedial action based on the processed signal, where the new remedial action comprises information pertaining to selecting of a first set of actuators of the one or more actuators, and controlling operation of corresponding one or more electronic items using the first set of actuators.
[0068] FIG. 2 illustrates an exemplary functional diagram for executing pre-
configurable tasks in accordance with an embodiment of the present disclosure.
[0069] In an aspect, module diagram 200 of the computing engine 102 may comprise
one or more processor(s) 202. The one or more processor(s) 202 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the computing engine 102. The memory 204 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0070] The computing engine 102 may also comprise an interface(s) 206. The
interface(s) 206 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 may facilitate communication of computing engine 102. The interface(s) 206 may also provide a communication pathway for one or more components of the computing engine 102. Examples of such components include, but are not limited to, processing engine(s) 208 and data 210.

[0071] The processing engine(s) 208 may be implemented as a combination of
hardware and programming (for example, programmable instructions) to implement one or
more functionalities of the processing engine(s) 208. In examples described herein, such
combinations of hardware and programming may be implemented in several different ways.
For example, the programming for the processing engine(s) 208 may be processor executable
instructions stored on a non-transitory machine-readable storage medium and the hardware
for the processing engine(s) 208 may comprise a processing resource (for example, one or
more processors), to execute such instructions. In the present examples, the machine-readable
storage medium may store instructions that, when executed by the processing resource,
implement the processing engine(s) 208. In such examples, the computing engine 102 may
comprise the machine-readable storage medium storing the instructions and the processing
resource to execute the instructions, or the machine-readable storage medium may be
separate but accessible to computing engine 102 and the processing resource. In other
examples, the processing engine(s) 208 may be implemented by electronic circuitry.
[0072] The data 210 may comprise data that is either stored or generated as a result of
functionalities implemented by any of the components of the processing engine(s) 208 or the computing engine 102.
[0073] In an exemplary embodiment, the processing engine(s) 208 may include a pre-
processing engine 212, a background classification engine 214, a remedial action execution engine 216, and other engine(s) 218.
[0074] In an embodiment, the pre-processing engine 212 is configured to receive the input signal generated by the sensors based on the sensed set of attributes. In an embodiment, the received input signals can be pre-processed by the pre-processing engine 212.
[0075] In an embodiment, the attributes sensed depend on the type of sensor. For example, if the sensor is an optical sensor, it will be used for capturing various optical attributes, such as capturing an image. In another example, if the sensor is an audio sensor, it will be used for capturing various audio attributes, such as capturing an audible sample.
[0076] In an embodiment, the pre-processing engine 212 can be configured to enhance the captured or sensed attributes. The operations that can be performed include, but are in no way limited to, enhancement, resizing, correction and reduction of data information of the input signal. In an embodiment, the particular operations can depend on the type of attributes captured or sensed by the sensors. For example, if the sensor is an image sensor, the enhancement can include contrast stretching, an efficient and computationally cheap technique implemented to enhance image quality, image resizing etc. In an embodiment, the pre-processing can help ensure faster processing.
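A minimal sketch of the contrast-stretching operation mentioned above, assuming an 8-bit NumPy image; this is one common formulation, not necessarily the one used in the disclosed engine.

```python
import numpy as np

def contrast_stretch(image):
    """Min-max contrast stretching: linearly rescale intensities to [0, 255]."""
    lo, hi = float(image.min()), float(image.max())
    if hi == lo:                      # flat image: nothing to stretch
        return image.copy()
    out = (image.astype(np.float32) - lo) * (255.0 / (hi - lo))
    return out.astype(np.uint8)
```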
[0077] Those skilled in the art would appreciate that the pre-processing engine 212 focuses on enhancement and performs certain operations on the input image frames to ensure that processing in subsequent stages, through implementation of various other engines, is performed in less computational time. The pre-processing engine 212 also plays a vital role in ensuring a higher accuracy rate in executing various tasks efficiently.
[0078] In an embodiment, the background classification engine 214 can be used for
classification of a real-time background based on the sensed set of attributes. In an
embodiment, the background or backdrop can be explained as the secondary task or object
when sensing or capturing or analysing a primary designated task or object.
[0079] In an embodiment, the background classification engine 214 can be configured to extract various parameters from the processed signals received from the pre-processing engine 212. Further, the extracted parameters can be compared with a dataset comprising various pre-stored parameters, where each pre-stored parameter has a background associated with it. The comparison can help determine a suitable background for the instant sensed or captured attributes.
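A toy sketch of this extraction-and-comparison step; the histogram feature and nearest-neighbour matching are stand-ins assumed for illustration, since the disclosure does not fix the particular parameters or distance measure.

```python
import numpy as np

def extract_parameters(processed, bins=32):
    """Extract parameters from the processed signal: here, a normalized
    intensity histogram serves as a toy feature vector."""
    hist, _ = np.histogram(processed, bins=bins, range=(0, 255))
    return hist / max(hist.sum(), 1)

def classify_background(params, dataset):
    """Compare the extracted parameters with prestored per-background
    parameters and return the label of the closest background."""
    return min(dataset, key=lambda label: np.linalg.norm(params - dataset[label]))
```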
[0080] In an embodiment, the remedial action execution engine 216, based on the classification of the background and the processed signals, can generate a control signal to be transmitted to the actuators to facilitate execution of a remedial action for controlling the electronic items operatively coupled to the actuators.
[0081] In an embodiment, the background classification can facilitate parsing of the extracted features, where the various parsing steps can include pattern recognition, pattern detection, pattern classification, pattern forecasting and the like using shallow or deep networks, a convolutional neural network (CNN), a support vector machine (SVM), graphs etc.
[0082] In an embodiment, the remedial action execution engine 216 can help select at least one actuation pipeline from a dataset comprising various pipelines for performing various functions, and the selected actuation pipeline can help facilitate taking a remedial action.
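A minimal sketch of control-signal generation against a hypothetical lookup table of actuation pipelines; the background labels, actuator names and actions below are invented for illustration and are not taken from the disclosure.

```python
# Hypothetical actuation-pipeline table: each trained background label maps
# to the set of actuators to select and the remedial action to execute.
ACTUATION_PIPELINES = {
    "intruder": {"actuators": ["relay_1", "buzzer"], "action": "sound_alarm"},
    "darkness": {"actuators": ["relay_2"], "action": "switch_on_led"},
}

def generate_control_signal(background):
    """Build the control signal: the selected set of actuators plus the
    remedial action to be taken for the classified background."""
    pipeline = ACTUATION_PIPELINES.get(background)
    if pipeline is None:
        raise KeyError(f"no actuation pipeline configured for '{background}'")
    return {"background": background, **pipeline}
```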
[0083] FIG. 3A is a flow diagram illustrating a process for executing pre-configurable
tasks in accordance with an embodiment of the present disclosure.

[0084] In an aspect, the proposed method may be described in the general context of
computer-executable instructions. Generally, computer-executable instructions can include
routines, programs, objects, components, data structures, procedures, modules, functions,
etc., that perform particular functions or implement particular abstract data types. The method
can also be practised in a distributed computing environment where functions are performed
by remote processing devices that are linked through a communications network. In a
distributed computing environment, computer-executable instructions may be located in both
local and remote computer storage media, including memory storage devices.
[0085] The order in which the method is described is not intended to be construed as
a limitation and any number of the described method blocks may be combined in any order to implement the method or alternate methods. Additionally, individual blocks may be deleted from the method without departing from the spirit and scope of the subject matter described herein. Furthermore, the method may be implemented in any suitable hardware, software, firmware, or combination thereof. However, for ease of explanation, in the embodiments described below, the method may be considered to be implemented in the above-described system.
[0086] In the context of the flow diagram 300, step 302 pertains to generating, by one or
more sensors, an input signal pertaining to one or more attributes sensed by the one or more sensors. The sensors may include cameras, microphones, optical/spectral sensors, piezoelectric, temperature/thermal/IR/NIR sensors, pressure sensors etc. The sensors can be used for sensing various attributes specific to the type of sensor selected to generate an input signal indicative of the sensed various set of attributes by the sensors. In an embodiment, the attributes sensed is dependent on type of the sensors. For example, if the sensor is an optical sensor then it will be used for capturing various optical attributes such as capturing an image. In another example, if the sensor is an audio sensor, then it will be used for capturing various audio attributes such as capturing audible sample.
[0087] Further, step 304 pertains to pre-processing, by one or more processors of a computing engine operatively coupled to the one or more sensors and one or more actuators, the generated input signal to generate a processed signal, further enhancing the captured or sensed attributes. The operations that can be performed include, but are in no way limited to, enhancement, resizing, correction and reduction of data information of the input signal. In an embodiment, the particular operations can depend on the type of attributes captured or sensed by the sensors. For example, if the sensor is an image sensor, the enhancement can include contrast stretching, an efficient and computationally cheap technique implemented to enhance image quality, image resizing etc. In an embodiment, the pre-processing can help ensure faster processing.
[0088] Further, step 306 pertains to classifying, by the one or more processors, the
processed signal into at least one of one or more backgrounds by extracting one or more
parameters from the processed signal, and comparing the extracted one or more parameters
with a dataset comprising a set of pre-stored parameters associated with corresponding
background. In response to the background classification of the extracted one or more
parameters, step 308 pertains to generating, by the one or more processors, a control signal,
where the control signal comprises information pertaining to selecting of a set of actuators of
the one or more actuators and a remedial action to be taken, and wherein the selected set of
actuators, on receipt of the generated control signal, execute the remedial action
corresponding to one or more electronic items operatively coupled to the one or more
actuators.
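Tying steps 302-308 together, the following sketch shows one inference pass; it assumes the helper functions from the earlier sketches (capture_frame, preprocess, extract_parameters, classify_background, generate_control_signal) are in scope, and the dispatch step is a stand-in for the actuation hardware.

```python
def dispatch_to_actuators(control):
    """Stand-in for the actuation hardware board: log the remedial action."""
    print(f"actuators {control['actuators']} -> execute '{control['action']}'")

def run_inference_once(dataset):
    """One pass of steps 302-308: sense, pre-process, classify, actuate."""
    frame = capture_frame()                          # step 302: input signal
    processed = preprocess(frame)                    # step 304: processed signal
    params = extract_parameters(processed)           # step 306: extract parameters
    background = classify_background(params, dataset)
    control = generate_control_signal(background)    # step 308: control signal
    dispatch_to_actuators(control)                   # selected actuators act
    return control
```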
[0089] FIG. 3B is a flow diagram illustrating a process for training a database in
accordance with an embodiment of the present disclosure.
[0090] In an aspect, the proposed method may be described in the general context of
computer-executable instructions. Generally, computer executable instructions can include
routines, programs, objects, components, data structures, procedures, modules, functions,
etc., that perform particular functions or implement particular abstract data types. The method
can also be practiced in a distributed computing environment where functions are performed
by remote processing devices that are linked through a communications network. In a
distributed computing environment, computer-executable instructions may be located in both
local and remote computer storage media, including memory storage devices.
[0091] The order in which the method is described is not intended to be construed as
a limitation and any number of the described method blocks may be combined in any order to
implement the method or alternate methods. Additionally, individual blocks may be deleted
from the method without departing from the spirit and scope of the subject matter described
herein. Furthermore, the method may be implemented in any suitable hardware, software,
firmware, or combination thereof. However, for ease of explanation, in the embodiments
described below, the method may be considered to be implemented in the above-described
system.
[0092] In the context of the flow diagram 350, step 352 pertains to generating, by the one
or more sensors, the input signal pertaining to the one or more attributes sensed by the one or
more sensors. Further, step 354 pertains to pre-processing, by the one or more processors, the
generated input signal to generate a processed signal. Further, step 356 pertains to classifying,
by the one or more processors, the processed signal into at least one new background by
extracting information from a set of data packets received from a computing device
associated with a user. Further, in response to the new background classification, step 358 pertains to generating, by the one or more processors, a new remedial action based on the
processed signal, where the new remedial action comprises information pertaining to
selecting of a first set of actuators of the one or more actuators, and controlling operation of
corresponding one or more electronic items using the first set of actuators.
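A sketch of the dataset-update steps 352-358; it assumes the extract_parameters function and ACTUATION_PIPELINES table from the earlier sketches, and the user_packets keys shown are illustrative stand-ins for the data packets received from the user's computing device.

```python
def update_dataset(dataset, processed, user_packets):
    """Steps 352-358: register a new background class and its remedial
    action from a processed signal plus user-supplied data packets."""
    label = user_packets["label"]                   # step 356: new background
    dataset[label] = extract_parameters(processed)  # store its parameters
    ACTUATION_PIPELINES[label] = {                  # step 358: new remedial action
        "actuators": user_packets["actuators"],
        "action": user_packets["action"],
    }

# Illustrative usage: a hypothetical 'door_open' background whose remedial
# action drives a first set of actuators.
# update_dataset(dataset, processed,
#                {"label": "door_open", "actuators": ["relay_3"],
#                 "action": "close_door"})
```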
[0093] FIG. 4 illustrates an exemplary graphical user interface in accordance with an
embodiment of the present disclosure.
[0094] In an embodiment, a screenshot of the proposed graphical user interface (GUI) 400 shows an example picture 402 as captured by the sensor (camera) and displayed live in the GUI. Block 404 shows the inference decision determined by the computing engine, displayed live on top of the recognized objects. Two inference decisions (human and cycle) are shown in block 404 as an example case. The GUI can comprise buttons such as "AI Tell Me" 406, "AI Magic Tasks" 408 etc. On pressing "AI Tell Me" 406, the inference decision can be displayed, and on pressing "AI Magic Tasks" 408, the user can be given an option to configure the desired actuations.
[0095] FIG. 5 illustrates an exemplary process of inference mode of operation in
accordance with an embodiment of the present disclosure.
[0096] In an embodiment, FIG. 5 discloses a schematic of the inference pipeline implemented in the current embodiment of the invention. The inference pipeline involves participation of all elements - sensor, AI engine, AI actuation hardware board, external actuators, peripherals etc. The GUI helps to manage the workflow and input user-defined settings. Further, block 502 pertains to capturing of data by the sensors. Block 504 pertains to pre-processing of the data, which includes feature extraction, enhancement, resizing, error correction etc. based on the captured or sensed data. Block 506 pertains to parsing of the extracted information for selection of an appropriate actuation pipeline specific to the data sensed by the sensors, based on various techniques that include pattern recognition, pattern detection, pattern classification, pattern forecasting and the like using shallow or deep networks, a convolutional neural network (CNN), a support vector machine (SVM), graphs etc. Further, block 508 pertains to generation of the control signal based on the determined pipeline. The generated output signal can then be transmitted to the actuators for controlling the various electronic items configured with them, as disclosed by block 510.

[0097] FIG. 6 illustrates an exemplary process of training mode of operation in
accordance with an embodiment of the present disclosure.
[0098] In context of the present example, during training mode, the AI network(s) or
model(s) inside the computing engine can be trained to detect and recognize new patterns from different forms of data (visual, sound etc.) on-the-fly.
[0099] In an embodiment, block 602 pertains to capturing of sensor data using the sensors. Block 604 pertains to pre-processing of the captured data to generate a processed signal. Block 606 pertains to setting up a background based on the set of sensor input signals captured by the sensors. Block 608 pertains to detection of patterns etc. Further, block 610 pertains to feature extraction from the processed signal. Further, block 612 pertains to a model generator for generating an execution pipeline for performing a remedial action using the actuators.
[00100] FIG. 7 illustrates an exemplary training mode in accordance with an
embodiment of the present disclosure.
[00101] In the context of the present example, block 702 pertains to capturing of images using a camera sensor. During this stage, raw data (on which new patterns are to be trained) is transmitted from the sensor to the computing engine. Block 704 pertains to pre-processing of the raw data by executing different sets of instructions, models or networks. Further, block 706 pertains to setting up of a background definition. In this step the system can accept the set of data packets of information from a user to define a data-type and situation-specific background condition for the training of the dataset. In an embodiment, for visual data the background refers to the background scene image, while for audio data the background refers to ambient background noise. The camera sensor is again used to capture the background data. The set of data packets received from the user can be used to activate the background definition mode by using the developed training-GUI and selecting the option of "Set Background". Once the background set action is complete, new patterns can be trained.
[00102] Further, block 708 pertains to object detection using the camera sensor to capture a direct instance of the new pattern. For example, in the current case of visual data this may be a new object of a specific kind that needs to be recognized; in the case of audio data this might be the speech of a specific person that needs to be recognized etc. Block 710 pertains to recording of the specific pattern, based on receipt of a second set of data packets of information from the user pertaining to the "train" option on the GUI, together with a label or class name for the specific pattern. The pre-built pattern detector can automatically focus on the pattern of interest and discard additional peripheral information. After adding pattern labels, an AI model generator block inside the computing engine can extract key features of the pattern and update the model to recognize a similar pattern from new data, as disclosed by block 712. This completes the training process.
[00103] FIG. 8 illustrates an exemplary execution of remedial action using an
exemplary actuation pipeline in accordance with an embodiment of the present disclosure.
[00104] In an embodiment, according to block 802, the pre-trained AI model/network inside the computing engine can be used to drive or control real-world objects in real-time using the actuation pipeline of the proposed invention. This is achieved with the help of the specially developed actuation hardware circuit board. When sensor data is transmitted to the computing engine and inference is requested, as stated by block 804, the generated inference decision is passed from the computing engine to the AI actuation hardware circuit board, as shown by block 806. Using the GUI, the user can assign specific types of actuations to specific types of inferences. In live inference, the circuit on the actuation board will translate the logical inference output from the computing engine to the electrical equivalent required to drive or operate the specific actuation, as disclosed by block 808.
[00105] In an exemplary embodiment, the user can enable the actuation by assigning a specific pattern class to a specific actuation action. Duration or intensity level of the actuation can also be controlled as per user requirements. Once the actuation assignment in the GUI menu is complete, the SET button is pressed to save the configuration. The actuation board also has the provision to control external microcontrollers and microprocessors by supplying the actuation signal using interfaces such as I2C, IP port etc., as shown by block 810.
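A sketch of translating a logical inference into an electrical actuation with user-configurable duration and intensity; it assumes a Raspberry Pi-class board and the RPi.GPIO library, and the pin number is hypothetical wiring rather than anything specified by the disclosure.

```python
import time
import RPi.GPIO as GPIO  # available on Raspberry Pi-class boards only

RELAY_PIN = 17  # hypothetical wiring of a relay/LED output channel

def actuate(duration_s=2.0, intensity_pct=100.0):
    """Drive one output channel for a user-configured duration; the PWM
    duty cycle stands in for the configurable intensity level."""
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(RELAY_PIN, GPIO.OUT)
    pwm = GPIO.PWM(RELAY_PIN, 100)   # 100 Hz carrier frequency
    pwm.start(intensity_pct)         # duty cycle in percent
    time.sleep(duration_s)
    pwm.stop()
    GPIO.cleanup(RELAY_PIN)
```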
[00106] FIG. 9 illustrates an exemplary implementation using a camera and a toy car in
accordance with an embodiment of the present disclosure.
[00107] FIG. 9 shows control of an external robotic car using the proposed invention. In the context of the present example, the computing engine 902 can be connected to the AI hardware actuation board through ribbon cables and to a camera sensor 904 through a USB connection. The camera is mounted on the external robotic car. The I2C ports on the AI hardware actuation board can be connected to an external microcontroller (Arduino), which in this case can be mounted on the robotic car. When a pre-trained object appears in the frame of the camera, it can first be recognized by the computing engine 902. The inference (name of the recognized object) is transmitted to the AI actuation board. The actuation board then induces actuations that have been predefined by the user. The actuation board further communicates with the controller of the robotic car using the I2C pins. All settings and assignments can be done by the user using the developed GUI. Finally, the motors of the car can be controlled as per the control signal generated by the computing engine 902.
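A sketch of the last hop in this camera-and-toy-car scenario; it assumes the smbus2 library and a hypothetical command-byte protocol agreed with the Arduino, so the labels, slave address and byte values below are all illustrative.

```python
from smbus2 import SMBus  # pip install smbus2

CAR_I2C_ADDR = 0x08                        # hypothetical Arduino slave address
COMMANDS = {"person": 0x01, "ball": 0x02}  # label -> command byte (illustrative)

def drive_car(inference_label, bus_id=1):
    """Forward the recognized-object inference to the car's microcontroller
    over I2C; the Arduino sketch maps each byte to a motor action."""
    command = COMMANDS.get(inference_label, 0x00)  # 0x00: stop / no-op
    with SMBus(bus_id) as bus:
        bus.write_byte(CAR_I2C_ADDR, command)
```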
[00108] In an embodiment, the system is hardware-agnostic, as the interfaces provided on the AI actuation board are generic, and the GUI is programming-free. This drastically improves the usability of the invention. Single training points can be used to train new patterns, which reduces the need for large datasets. The inference decision is accompanied by a confidence value. The proposed system provides real-time association of AI-based inference and real-world action/actuation. The proposed invention also offers a provision to train the AI model on new patterns. The GUI, inferencing and actuation board can be used for higher-level actuation tasks such as secondary analytics like pattern counting, anomaly detection, forecasting, change detection etc. Another key advantage of the proposed system is security. Most present-day AI-based models/networks utilize cloud infrastructure, where data from the sensor has to be sent to a cloud or datacentre. This causes privacy/security vulnerability and leads to high data transmission costs. Cloud-based AI also adds inference decision latency to the entire pipeline. In the proposed system, the training, inference and actuation actions are all executed 100% on the edge and on the end devices, without the involvement of any cloud or datacentres. This makes the data pipeline more secure, low-power and cost-effective.
[00109] Although the proposed system has been elaborated as above to include all the
main parts, it is completely possible that actual implementations may include only a part of
the proposed modules/engines or a combination of those or a division of those in various
combinations across multiple devices that can be operatively coupled with each other,
including in the cloud. Further, the modules/engines can be configured in any sequence to
achieve objectives elaborated. Also, it can be appreciated that proposed system can be
configured in a computing device or across a plurality of computing devices operatively
connected with each other, wherein the computing devices can be any of a computer, a
laptop, a smart phone, an Internet enabled mobile device and the like. All such modifications
and embodiments are completely within the scope of the present disclosure.
[00110] Embodiments of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of software and hardware that may all generally be referred to herein as a "circuit," "module," "component," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product comprising one or more computer-readable media having computer-readable program code embodied thereon.

[00111] Thus, it will be appreciated by those of ordinary skill in the art that the
diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular named element.
[00112] As used herein, and unless the context dictates otherwise, the term "coupled
to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[00113] It should be apparent to those skilled in the art that many more modifications
besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced. Where the specification or claims refer to at least one of something selected from the group consisting of A, B, C .... and N, the text should be interpreted as requiring only one element from the group, not A plus N, or B plus N, etc.
[00114] While the foregoing describes various embodiments of the invention, other and
further embodiments of the invention may be devised without departing from the basic scope
thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable people having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[00115] The present disclosure provides system and method for executing configurable
tasks.
[00116] The present disclosure provides system and method with a ready-to-use
feature that enables a non-professional user to perform experimentation, invention and
the like.
[00117] The present disclosure provides system and method that enables training or
updating of the dataset to enhance usability.
[00118] The present disclosure provides system and method for executing configurable
tasks that does not require exchanging data packets of information with a remote server;
processing is thus faster and security is enhanced.
[00119] The present disclosure provides system and method for executing configurable
tasks that reduces or prevents costs incurred due to network data expenditure.
[00120] The present disclosure provides system and method for executing configurable
tasks that is cost-effective and easy to implement.

We Claim:

1. A method comprising the steps of:
generating, by one or more sensors, an input signal pertaining to one or more attributes sensed by the one or more sensors;
pre-processing, by one or more processors of a computing engine operatively coupled to the one or more sensors and one or more actuators, the generated input signal to generate a processed signal;
classifying, by the one or more processors, the processed signal into at least one of one or more backgrounds by extracting one or more parameters from the processed signal, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and
in response to the background classification of the extracted one or more parameters, generating, by the one or more processors, a control signal, where the control signal comprises information pertaining to selecting of a set of actuators of the one or more actuators and a remedial action to be taken, and wherein the selected set of actuators, on receipt of the generated control signal, execute the remedial action corresponding to one or more electronic items operatively coupled to the one or more actuators.
2. The method as claimed in claim 1, wherein the method comprises updating the dataset,
said updating the dataset comprises the steps of:
generating, by the one or more sensors, the input signal pertaining to the one or more attributes sensed by the one or more sensors;
pre-processing, by the one or more processors, the generated input signal to generate a processed signal;
classifying, by the one or more processors, the processed signal into at least one new background by extracting information from a set of data packets received from a computing device associated with a user; and
in response to the new background classification, generating, by the one or more processors, a new remedial action based on the processed signal, where the new remedial action comprises information pertaining to selecting of a first set of actuators of the one or more actuators, and controlling operation of corresponding one or more electronic items using the first set of actuators.

3. The method as claimed in claim 1, wherein the pre-processing step comprises any or a combination of enhancement, resizing, correction and reduction of data information of the input signal.
4. The method as claimed in claim 1, wherein each of the one or more sensors comprises any or a combination of an optical sensor, a microphone, a spectral sensor, a piezoelectric sensor, a temperature sensor, a thermal imaging sensor, an infrared sensor, a near infrared sensor, a pressure sensor and a proximity sensor.
5. The method as claimed in claim 1, wherein the one or more actuators comprise a relay, a switch, an electrical actuator, a mechanical actuator and an electro-mechanical actuator.
6. The method as claimed in claim 1, wherein the remedial action comprises controlling
operation of the one or more electronic items using the one or more actuators.
7. The method as claimed in claim 1, wherein the one or more electronic items comprise a Liquid Crystal Display, a Light Emitting Diode, a motor, a buzzer, an audio unit and a video unit.
8. The method as claimed in claim 1, wherein the method comprises estimating accuracy in executing the remedial action.
9. A system comprising:
one or more actuators configured with one or more electronic items;
one or more sensors configured to generate an input signal pertaining to one or more attributes sensed by the one or more sensors; and
a computing engine operatively coupled with the one or more sensors and the one or more actuators, the computing engine comprising one or more processors coupled to a memory, the memory storing instructions executable by the one or more processors to:
pre-process the generated input signal to generate a processed signal;
classify the processed signal into at least one of one or more backgrounds by extracting one or more parameters, and comparing the extracted one or more parameters with a dataset comprising a set of prestored parameters associated with corresponding background; and
in response to the background classification of the extracted one or
more parameters, generate a control signal, where the control signal
comprises information pertaining to selecting of a set of actuators of the one
or more actuators and a remedial action to be taken; and
wherein the selected set of actuators, on receipt of the generated control signal,
execute the remedial action corresponding to the one or more electronic items.

10. The system as claimed in claim 9, wherein the system comprises an input unit operatively coupled to the computing engine to enable interaction of a user with the computing engine, the input unit comprises a keyboard, a mouse, a stylus, an audio input unit, a visual input unit and a touch enabled display.
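Claim 2 above additionally covers updating the dataset with a new background and a new remedial action supplied from a user's computing device. The following is a minimal sketch of that update path, reusing the illustrative (hypothetical) DATASET and REMEDIAL_ACTIONS structures from the earlier sketch, redeclared here so the fragment runs on its own; it is not the patented implementation.

# Illustrative structures, as in the earlier sketch (hypothetical names).
DATASET = {"quiet_room": [0.1, 0.2, 0.1]}
REMEDIAL_ACTIONS = {"quiet_room": {"actuators": ["led"], "action": "turn_off_alarm"}}

def preprocess(raw_signal, full_scale=1.0):
    """Toy pre-processing, as in the earlier sketch."""
    return [min(abs(s) / full_scale, 1.0) for s in raw_signal]

def update_dataset(new_background, features, remedial):
    """Register a new background and its remedial action entirely on-device.

    Mirrors claim 2: the processed signal's extracted parameters become the
    prestored parameters for the new background, and the user-supplied
    remedial action (selected actuators plus operation) is stored alongside it.
    """
    DATASET[new_background] = list(features)
    REMEDIAL_ACTIONS[new_background] = remedial

# Example: a user labels a new pattern from their computing device.
update_dataset(
    "door_knock",
    preprocess([0.4, 0.9, 0.3]),
    {"actuators": ["buzzer"], "action": "notify_user"},
)
assert "door_knock" in DATASET   # later classifications can now match it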

Documents

Application Documents

# Name Date
1 201911043811-STATEMENT OF UNDERTAKING (FORM 3) [29-10-2019(online)].pdf 2019-10-29
2 201911043811-FORM FOR STARTUP [29-10-2019(online)].pdf 2019-10-29
3 201911043811-FORM FOR SMALL ENTITY(FORM-28) [29-10-2019(online)].pdf 2019-10-29
4 201911043811-FORM 1 [29-10-2019(online)].pdf 2019-10-29
5 201911043811-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-10-2019(online)].pdf 2019-10-29
6 201911043811-EVIDENCE FOR REGISTRATION UNDER SSI [29-10-2019(online)].pdf 2019-10-29
7 201911043811-DRAWINGS [29-10-2019(online)].pdf 2019-10-29
8 201911043811-DECLARATION OF INVENTORSHIP (FORM 5) [29-10-2019(online)].pdf 2019-10-29
9 201911043811-COMPLETE SPECIFICATION [29-10-2019(online)].pdf 2019-10-29
10 abstract.jpg 2019-10-30
11 201911043811-FORM-9 [28-11-2019(online)].pdf 2019-11-28
12 201911043811-STARTUP [29-11-2019(online)].pdf 2019-11-29
13 201911043811-FORM28 [29-11-2019(online)].pdf 2019-11-29
14 201911043811-FORM 18A [29-11-2019(online)].pdf 2019-11-29
15 201911043811-Proof of Right (MANDATORY) [06-12-2019(online)].pdf 2019-12-06
16 201911043811-FORM-26 [18-12-2019(online)].pdf 2019-12-18
17 201911043811-FER.pdf 2020-01-23
18 201911043811-FER_SER_REPLY [13-03-2020(online)].pdf 2020-03-13
19 201911043811-DRAWING [13-03-2020(online)].pdf 2020-03-13
20 201911043811-CORRESPONDENCE [13-03-2020(online)].pdf 2020-03-13
21 201911043811-COMPLETE SPECIFICATION [13-03-2020(online)].pdf 2020-03-13
22 201911043811-CLAIMS [13-03-2020(online)].pdf 2020-03-13
23 201911043811-ABSTRACT [13-03-2020(online)].pdf 2020-03-13
24 201911043811-US(14)-HearingNotice-(HearingDate-09-07-2020).pdf 2020-06-11
25 201911043811-Correspondence to notify the Controller [02-07-2020(online)].pdf 2020-07-02
26 201911043811-Written submissions and relevant documents [24-07-2020(online)].pdf 2020-07-24
27 201911043811-FORM-26 [24-07-2020(online)].pdf 2020-07-24
28 201911043811-Annexure [24-07-2020(online)].pdf 2020-07-24
29 201911043811-Response to office action [20-08-2020(online)].pdf 2020-08-20
30 201911043811-PatentCertificate07-09-2020.pdf 2020-09-07
31 201911043811-IntimationOfGrant07-09-2020.pdf 2020-09-07
32 201911043811-RELEVANT DOCUMENTS [23-09-2022(online)].pdf 2022-09-23
33 201911043811-RELEVANT DOCUMENTS [19-07-2023(online)].pdf 2023-07-19

Search Strategy

1 201911043811_search_22-01-2020.pdf

ERegister / Renewals

3rd: 11 Sep 2020 (from 29/10/2021 to 29/10/2022)
4th: 11 Sep 2020 (from 29/10/2022 to 29/10/2023)
5th: 17 Oct 2023 (from 29/10/2023 to 29/10/2024)
6th: 29 Aug 2024 (from 29/10/2024 to 29/10/2025)
7th: 29 Aug 2024 (from 29/10/2025 to 29/10/2026)
8th: 29 Aug 2024 (from 29/10/2026 to 29/10/2027)
9th: 29 Aug 2024 (from 29/10/2027 to 29/10/2028)
10th: 29 Aug 2024 (from 29/10/2028 to 29/10/2029)