
Augmented Reality Based Projection Of Printed Objects

Abstract: A system and method for providing an augmented reality based projection is provided. The augmented reality system is configured to create and/or load a three dimensional model; capture an underlying 2D image of a printed material to be projected via an image transformation interface; create a reference image from the captured image; use the reference image as a texture map for mapping UV’s of each vertex of the 3D model to the corresponding part of the reference image to create texture for projection on the underlying 2D image; and produce an interactive augmented reality representation on the underlying 2D image having the texture for display. FIG. 3


Patent Information

Application #: 3254-DEL-2015
Filing Date: 20 October 2015
Publication Number: 21/2017
Publication Type: INA
Invention Field: PHYSICS
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2018-11-29
Renewal Date:

Applicants

Smartivity Labs Private Limited
1st Floor, 258, Kuldeep House, Lane No. 3, Westend Marg, Saidulajab, New Delhi-110030

Inventors

1. Verma, Akash
ND 17 Zanskar, IIT Delhi, Hauz Khas, New Delhi-110016
2. Pangtey, Somit
Bisht Dhara Bithoria-1, Near By Pass Road, Haldwani, Nainital, Uttarakhand-263139

Specification

FIELD OF INVENTION
[001] The present invention relates to digital image processing and
more particularly, the invention relates to augmented reality based projection of
printed objects.
BACKGROUND
[002] Augmented reality (AR) is a supplemented live view of a
physical, real-world environment by a computer-generated sensory input
such as sound, video, graphics, etc. Augmented Reality Systems (ARS) may
use video cameras and/or other sensor modalities to reconstruct the
camera's position and orientation (pose) in the world and may recognize the
pose of an object for augmentation. This information may then be used to
generate synthetic imagery that may be properly registered (aligned) to the
world as viewed by the camera. The end user may be able to view and
interact with this augmented imagery in such a way as to provide additional
information about the objects in their view, or the world around them.
[003] However, the technology may not work for printed materials
like images, texts, etc. to provide an augmented three-dimensional view of
the printed materials. Therefore, there is a need for a system that obviates
the above drawbacks and provides a system and method for mapping
augmented reality objects to a printed material.
SUMMARY
[004] In accordance with the invention, a system and method for
providing an augmented reality based projection is provided. The augmented
reality system is configured to create and/or load a three dimensional model;
capture an underlying 2D image of a printed material to be projected via an
image transformation interface; create a reference image from the captured
image; use the reference image as a texture map for mapping UV’s of each
vertex of the 3D model to the corresponding part of the reference image to
create texture for projection on the underlying 2D image; and produce an
interactive augmented reality representation on the underlying 2D image
having the texture for display.
BRIEF DESCRIPTION OF THE DRAWINGS
[005] The following drawings are illustrative of embodiments of the
system developed or adapted using the teachings of at least one of the
inventions disclosed herein and are not meant to limit the scope of the
invention as encompassed by the claims.
[006] FIG. 1a illustrates a client-server architecture showing
connections between a client and server via a network in the context of an
augmented reality system.
[007] FIG. 1b illustrates an alternate client-server architecture
showing connections between a client and server via a network in the
context of the augmented reality system.
[008] FIG. 1c illustrates another embodiment of the augmented
reality system.
[009] FIG. 2 illustrates a modular block diagram of image
transformation engine.
[0010] FIG. 3 illustrates process steps for 3D projection on the
underlying 2D image.
[0011] FIG. 4 illustrates a flow diagram showing steps for image
extraction and removal of deformation in an image.
[0012] FIG. 5 illustrates a flow diagram depicting steps for texture
generation.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[0013] The present invention provides an augmented reality based
projection of printed objects. The present invention allows for gameplay and/or
training to contain augmented special effects. The invention creates a three
dimensional (3D) customized projection of a two-dimensional (2D) image from
the two-dimensional image itself. The two-dimensional image may be provided
on a coloring sheet, board, puzzles, flash cards, bots, books, etc. The coloring
sheet may be a sheet having a pre-defined design on which a user may put
colors of choice. However, the invention is not restricted to the above. A user
may use a tangible medium such as an art canvas, sketchpad, page of a book,
etc. for coloring. It will be understood that any other suitable types of data in an
intangible medium may also be presented by the invention. Examples include,
but are not limited to, two-dimensional models, images, documents,
spreadsheets, presentations, electronic drawings, etc. Thus, a two-dimensional
image (whether on tangible or intangible object) which is to be projected is
referred to as an underlying 2D image. The 3D projection appears on top of the
coloring sheet when viewed by a device such as a mobile phone, an electronic
tab, etc. In an embodiment, the 3D projection scales based on the proximity of
the device held by the user to the underlying 2D image. In an embodiment, the
invention allows a user to interact with 3D objects provided on the underlying
2D image and provides access to learning games based on the coloring sheet
which is being used.
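As a minimal sketch of the proximity-based scaling described above (assuming the apparent size of the tracked sheet in the camera frame is used as the proximity cue; the corner coordinates and reference width below are hypothetical):

```python
import numpy as np

def scale_from_proximity(corners_px, ref_width_px=800.0,
                         min_scale=0.5, max_scale=2.0):
    """Derive a display scale for the 3D projection from how large the tracked
    sheet appears in the camera frame (a proxy for device-to-sheet proximity).

    corners_px   -- 4x2 array of the sheet's detected corner pixels
    ref_width_px -- hypothetical apparent width at which the scale is 1.0
    """
    corners = np.asarray(corners_px, dtype=float)
    # Mean length of the two horizontal edges of the detected quadrilateral.
    top = np.linalg.norm(corners[1] - corners[0])
    bottom = np.linalg.norm(corners[2] - corners[3])
    apparent_width = 0.5 * (top + bottom)
    # A closer device sees a larger sheet, so the projection grows, clamped.
    return float(np.clip(apparent_width / ref_width_px, min_scale, max_scale))

# A sheet seen at roughly half the reference width yields a ~0.5x projection.
print(scale_from_proximity([[0, 0], [400, 10], [410, 300], [5, 290]]))
```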
[0014] Applications, software programs or computer readable
instructions may be referred to as components or modules. Applications may be
hardwired or hard coded in hardware or take the form of software executing on
a general purpose computer, such that, when the software is loaded into and/or
executed by the computer, the computer becomes an apparatus for practicing
the invention, or they are available via a web service. Applications may also be
downloaded in whole or in part through the use of a software development kit
or a toolkit that enables the creation and implementation of the present
invention. In this specification, these implementations, or any other form that
the invention may take, may be referred to as techniques. In general, the order
of the steps of the disclosed processes may be altered within the scope of the
invention.
[0015] Referring now to the drawings, FIG. 1a illustrates an augmented reality
system 100 configured as a client/server architecture used in an embodiment of
the present disclosure. A “client device” is a member of a class or group that
uses the services of another class or group to which it is not related. In the
context of a computer network, a client device is a process (i.e. roughly a
program or task) that requests a service which is provided by another process,
known as a server program. The process at the client device uses the requested
service without knowing any working details about the server program or the
server itself. In a networked system, a process at the client device usually runs
on a computer that accesses shared network resources provided by another
computer running a corresponding server process.
[0016] In FIG. 1a, the system 100 for practicing the teachings of the
present invention includes one or more client devices 10, one or more servers
20, a database 40 and a network 50 which is used for establishing
communication between the client device 10 and server 20. Each of the client
device 10 and the server 20 may include an application via which the
teachings of the present invention may be practiced.
[0017] The client device 10 may be any electronic device such as a
personal computer or any other type of personal communication device such
as tablet computer, smartphone, digital media player, or gaming console,
among other examples. The client device 10 may comprise a display 13, a
memory 15, a processor 17, an optional input/output controller 19, a
communication interface 21, and an image transformation interface 23 (part
of a client-side application).
[0018] The display 13 may be a discrete component interactively
linked to the client device 10, or may be an integral feature of the client
device 10. The display 13 may be, but is not limited to, an electroluminescent
display (ELD), a light emitting diode (LED) display, a cathode ray tube (CRT),
a liquid-crystal display (LCD), a plasma display panel (PDP), etc. The display 13
displays to the user a 3D representation or augmented reality representation
of the underlying 2D image having texture(s).
[0019] The processor 17 may be a central processing unit (CPU) which
runs an operating system and executes the application program stored in
memory 15. The processor 17 collaborates, coordinates and controls all other
modules for proper and smooth functioning. The memory 15 further stores
the digital image taken by the image transformation interface 23 for further
processing either by the processor 17 or by the server 20.
[0020] The optional input/output controller 19 is used for controlling
the information being displayed on the client device 10. The optional input
ports and devices may be universal serial bus (USB) ports, secure digital (SD)
card reader, fire-wire ports, lightning ports, serial ports, parallel ports, local
area network (LAN)/ wide area network (WAN) port, microphone ports, etc.
[0021] The communication interface 21 is used to provide the client
device 10 a dedicated, full-time connection to the network 50.
[0022] The image transformation interface 23 may provide the user
an interface to create or modify one or more images created by using
electronic painting, drawing, etc. The user may modify a pre-existing image
template, for example, by adding features or colors to the image template by
using the image transformation interface 23. The feature includes without
limitation colors, arrangements, relative positioning, and orientation.
However, if the image is provided on a tangible object such as a coloring
sheet or book, then the image transformation interface 23 may be a camera,
for example, configured to capture the user created image. The camera may
include, but is not limited to, one or more digital cameras such as a
two-dimensional camera, stereo camera, depth camera, time-of-flight
camera, structured light depth camera, etc. The digital camera may
include a complementary metal-oxide-semiconductor (CMOS) or charge-coupled
device (CCD) image sensor configured to transform the user created
or modified image to produce digital image data for further processing by the
server 20. In another embodiment, the image transformation interface 23
allows the user to select or provide inputs on the underlying 2D image which
is to be captured by the image transformation interface 23.
[0023] The network 50 is used for establishing communication
between the client device 10 and the server 20. The network 50 may be a
global area network (GAN), such as the Internet, a wide area network (WAN),
a local area network (LAN), or any other type of network or combination of
networks. The communication medium may be wireline, wireless, or a
combination of wireline and wireless communication medium between
devices in the network. In some embodiments of the invention, the
communication medium described herein may be a cloud computing
network.
[0024] A “server” is typically a remote computer system that is
accessible over a communication medium. The process at the client device may
be active on a portable device which communicates with a server process via
the network that allows multiple client devices to take advantage of the
information-gathering capabilities of the server. Thus, the server essentially acts
as an information provider for a computer network. In an embodiment, the
server 20 may be a heterogeneous server or any other kind of server known in
the art and includes a processor 22, a communication interface 24 and one or
more databases 40. The server includes a server side application which includes
image transformation engine 26, a 2D image module 30 and/or other modules.
A detailed description of each is as follows:
[0025] The processor 22 controls and collaborates the functioning of
all the modules and fetches the required data from the database 40.
[0026] The communication interface 24 is used to provide the server
20 a dedicated, full-time connection to the client device 10 via the network
50.
[0027] The image transformation engine 26 receives the underlying
2D image and detects or tracks one or more feature points on the surface of
the underlying 2D image. The feature points are obtained by gradient
difference on the image. Detection of feature points may be performed in a
number of different ways. In some instances, for example, a color feature
may be detected through the identification of color added to the image by
the user. In other instances, a color feature and/or surface pattern feature
may be detected through identification of a symbol or figure added to the image
by the user.
[0028] In some implementations, image transformation engine 26
may be configured to include object recognition capability, thereby enabling
detection of features introduced by the user as hand drawn markings,
such as shapes, on the image. These features are then matched with the feature
points of the 3D model in the database.
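The gradient-based feature points of paragraphs [0027]-[0028] and their matching against a stored design can be sketched with off-the-shelf OpenCV primitives. This is only an illustrative stand-in (ORB corners and brute-force Hamming matching); the file names and the match threshold are hypothetical.

```python
import cv2

# Hypothetical file names for the captured sheet and the stored reference design.
captured = cv2.imread("captured_sheet.jpg", cv2.IMREAD_GRAYSCALE)
reference = cv2.imread("reference_design.png", cv2.IMREAD_GRAYSCALE)

# ORB keypoints sit at strong local gradient changes, one way to realise the
# "gradient difference" feature points described above.
orb = cv2.ORB_create(nfeatures=1000)
kp_cap, des_cap = orb.detectAndCompute(captured, None)
kp_ref, des_ref = orb.detectAndCompute(reference, None)

# Match the captured features against the stored design's features.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_cap, des_ref), key=lambda m: m.distance)

# Treat the sheet as recognised if enough good matches survive (threshold assumed).
recognised = len(matches) >= 30
print(f"matches: {len(matches)}, recognised: {recognised}")
```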
[0029] In an embodiment, the image transformation engine 26
synthesizes a texture corresponding to the feature points to produce an
augmented reality representation using a pre-fed 3D model of the underlying
2D image. In an alternate embodiment, the image transformation engine 26
synthesizes a texture corresponding to the user created image for display to
the user. The texture is synthesized using, for example, a UV map (Texture
Map) for the augmented reality 3D models. The image transformation engine
26 has various modules which will be discussed in detail in FIG. 2.
[0030] The 2D image module 30 creates specially designed coloring
sheets. Each coloring sheet has a unique design like an animal, bird, object,
etc. The 2D image module 30 verifies each newly created coloring sheet to
identify if it is recognizable and distinguishable from other coloring sheets
stored in the database 40. For example, the 2D image module 30 checks
whether the feature points of the captured image match with the 3D model’s
2D reference UV map. If the 2D image module 30 finds a match, it
considers the captured image as being recognized. Further, the 2D image
module 30 creates coloring sheet bundles. A coloring sheet bundle comprises
a group of similarly themed coloring sheets such as a coloring sheet bundle
of landscape, sports, dolls, etc.
[0031] The database 40 may be a heterogeneous database or any
other kind of database known in the art. The database 40 may be externally
or internally attached to the server 20. The database 40 stores all the
information related to the underlying 2D image such as underlying 2D image
ID, underlying 2D image bundle ID, objects or characters, colors, background,
accessories, wardrobe, stickers etc. The database 40 also stores all the
information related to the image taken by the image transformation interface
23 of the underlying 2D image. The database 40 stores 3D models of images
that can be augmented using the teachings of the present invention. The
image data includes, but is not restricted to, image ID, objects or characters,
colors, background, accessories, wardrobe, stickers etc.
[0032] FIG. 1b shows another view of the system for practicing the
teachings of the present invention. The system shown in FIG. 1b includes one or
more client devices 10, one or more servers 20, a database 40 and a network 50
which is used for establishing communication between the client device 10 and
the server 20. In this embodiment, in addition to the image transformation
interface 23, the client side application also includes the image transformation
engine 26. The server side application consists of the 2D image module 30. Since
the image transformation engine 26 is on the client device 10, the processing
and transformation of the image is faster. The image transformation engine 26
on the client side application gives the user the ability to interact with another
user through the use of any online multiplayer environment hosted on the
server 20. Moreover, in some implementations, image transformation engine 26
is further configured to enable the users to upload the augmented reality
representation of the user created or modified image to augmented reality
image library 28, which is accessible to a community of users.
[0033] FIG. 1c depicts an alternate architecture illustrating a client
device. The client side application may consist of an image transformation
interface 23, an image transformation engine 26, and a 2D image module 30.
However, the functionality of all the modules is the same as discussed in FIG. 1a
and FIG. 1b, which may be referred to for details.
[0034] The client device 10 downloads the client side application from
an application store via network 50. The user may also download the coloring
sheets on his/her client device 10 and store them in its memory 15. Once the
download is completed and the client side application is installed, then the
client side application may run in offline mode without requiring the server 20,
database 40 and the network 50.
[0035] From the above, it is apparent that the present invention can be
implemented as a client server system or a single device system.
[0036] FIG. 2 illustrates a modular block diagram of the image
transformation engine 26. The image transformation engine 26 includes various
modules such as a selection module 205, an image extraction module 207, a
UV mapping module 209 and a game module 211. A detailed description of the
modules is as follows:
[0037] The selection module 205 receives inputs from the user to
identify the underlying 2D image bundle being used and the underlying 2D
image that the user would like to be projected using the image transformation
interface 23. The selection module 205, on the basis of the user selection,
identifies the corresponding 3D model to be used for generating the UV map.
[0038] The image extraction module 207 scans and extracts the image of
the underlying 2D image. For extraction, the image extraction module 207
detects one or more feature points (described in FIG. 1a) of the underlying 2D
image. The image extraction module 207 also identifies if the extracted image(s)
is distorted or deformed and removes the distortion and/or rectifies the extracted image(s).
The detailed steps for removing deformation from the extracted image(s) will be
discussed in FIG. 4.
[0039] The image extraction module 207 is configured to apply
homography transformations on the extracted image to transform the image
into a reference image for UV mapping. The homography transformation relates
any two images (initial and projected) of the same planar surface in space by
homography. The technique extracts camera rotation and translation from an
estimated homography matrix and inserts models of 3D objects into an image or
video.
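As a minimal sketch of this homography step (assuming matched feature points between the captured image and the stored reference design are already available), OpenCV's findHomography and warpPerspective can produce the rectified reference image; if needed, camera rotation and translation can then be recovered with a routine such as cv2.decomposeHomographyMat given the intrinsic matrix.

```python
import cv2
import numpy as np

def rectify_to_reference(captured_img, captured_pts, reference_pts, ref_size):
    """Warp the captured sheet into the reference frame used for UV mapping.

    captured_pts, reference_pts -- Nx2 arrays of matched feature points
    ref_size                    -- (width, height) of the reference image
    """
    src = np.asarray(captured_pts, dtype=np.float32)
    dst = np.asarray(reference_pts, dtype=np.float32)

    # Homography relating two views of the same planar sheet; RANSAC rejects
    # mismatched feature points.
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Resample the captured image onto the reference plane.
    rectified = cv2.warpPerspective(captured_img, H, ref_size)
    return H, rectified
```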
[0040] The UV mapping module 209 lays texture on the reference image.
The texture synthesis performed by the UV mapping module 209 may be an
example based texture synthesis process that uses the image or image
modification produced by a user as an exemplar. The texture synthesis may also
be augmented by techniques such as texture by numbers (TBN), allowing users
to specify various additional constraints and texture areas for high quality,
visually consistent renderings with user defined art direction. In one
implementation, for example, the texture synthesis may include generating a UV
texture for a 3D augmented reality representation of the user created or
modified image, generating a special unique UV texture and generating a UV
map. The texture synthesis process may further include rendering the 3D
augmented reality representation with the special unique UV texture as input
and processing the rendering together with the UV map image.
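A much-simplified sketch of the UV texture step (assuming a rectified reference image and per-vertex UV coordinates in the usual [0, 1] range) is to sample the reference image at each vertex's UV coordinate; a full texture synthesis pipeline as described above would do considerably more.

```python
import numpy as np

def sample_texture_at_uvs(reference_img, uvs):
    """Look up the colour of the rectified reference image at each vertex UV.

    reference_img -- HxWx3 rectified image of the coloured sheet
    uvs           -- Nx2 array of per-vertex UV coordinates in [0, 1]
    Returns an Nx3 array of colours, one per 3D-model vertex.
    """
    h, w = reference_img.shape[:2]
    uv = np.clip(np.asarray(uvs, dtype=float), 0.0, 1.0)

    # Convert UV (u to the right, v upwards) into pixel rows and columns.
    cols = (uv[:, 0] * (w - 1)).astype(int)
    rows = ((1.0 - uv[:, 1]) * (h - 1)).astype(int)
    return reference_img[rows, cols]
```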
[0041] The game module 211 identifies learning games based on the
underlying 2D image and displays the identified learning games on the display.
The game module 211 in addition to the selected underlying 2D image also uses
the underlying 2D image bundle for identifying learning games. The game
module 211 is activated once the image extraction module 207 is activated and
stays active till the underlying 2D image is in view of the image transformation
interface 23.
[0042] FIG. 3 illustrates process steps for 3D projection on the
underlying 2D image. At step 302, an image of the underlying 2D image is
captured. Initially, the user captures an image of the printed material or
creates a digital image which is to be augmented. This image is referred to as
the underlying 2D image. On the basis of this image, the system identifies
the corresponding 3D model stored in the database. Alternatively, the 3D model
may be generated in real-time. The captured underlying 2D image is sent to
the image transformation engine 26.
[0043] At step 304, the image extraction module 207 extracts feature
points from the underlying 2D image. The extraction refers to the image
recognition and any known techniques for the same may be applied. The
image extraction module 207 then detects and removes
distortions/deformations in the obtained image to create a reference image.
The details of removing deformation in the obtained image are discussed in
FIG. 4 and FIG. 5.
[0044] At step 306, the UV mapping module 209 uses the reference
image as the UV map for mapping the UV’s of each vertex of the 3D model to
the corresponding part of the corrected image to create texture for projection
on the underlying 2D image. This may be referred to as an augmented reality
representation.
[0045] At step 308, the produced augmented reality representation is
rendered on the underlying 2D image via the display 13 of the user. Optionally,
the user may interact with the augmented reality representation by entering
inputs from the client device 10. For example, the user may provide inputs
commanding movement of the augmented reality representation within the
augmented reality scene, and/or interaction of the augmented reality
representation with other augmented reality representations included in the
augmented reality scene.
[0046] FIG. 4 illustrates a flow diagram showing steps for image
extraction and removal of deformation in a captured image. At step 402, the
image transformation engine 26 obtains the underlying 2D image from the
image transformation interface 23. The underlying 2D image is made of a
plurality of frames.
[0047] At step 404, the image extraction module 207 of the image
transformation engine 26 creates a custom model view projection matrix for
every frame of the image. The matrix is based on an intrinsic calibration of
the image transformation interface 23. This custom model view projection
matrix is created to identify the colored data present in the frame. The
frames are fetched from the database for the selected 3D model.
[0048] In an embodiment, a fragment shader is used to create the
custom model view projection matrix. Once the matrix is created, it is
updated until the movement of the camera stops, to ensure that an updated
matrix that corresponds to the final captured image is formed.
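One way to build such a per-frame matrix from the intrinsic calibration is sketched below; this is a hedged illustration only, the sign conventions for the principal-point terms depend on the renderer's image-origin convention, and the 4x4 pose matrix is assumed to be supplied by the tracking step (for example from the estimated homography).

```python
import numpy as np

def projection_from_intrinsics(fx, fy, cx, cy, width, height,
                               near=0.1, far=100.0):
    """OpenGL-style projection matrix derived from the camera's intrinsic
    calibration (focal lengths fx, fy and principal point cx, cy in pixels).
    Sign conventions for the principal-point terms vary between renderers."""
    return np.array([
        [2.0 * fx / width, 0.0,               1.0 - 2.0 * cx / width,       0.0],
        [0.0,              2.0 * fy / height, 2.0 * cy / height - 1.0,      0.0],
        [0.0,              0.0,              -(far + near) / (far - near), -2.0 * far * near / (far - near)],
        [0.0,              0.0,              -1.0,                          0.0],
    ])

def model_view_projection(projection, model_view):
    """Combine the projection with the 4x4 camera pose (model-view) matrix
    recovered for the current frame."""
    return projection @ model_view
```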
[0049] At step 406, the values from the custom model view projection
matrix are used to create a clipped image. The clipped image is created by
clipping the frames. The frames are clipped so that only colored data remains.
[0050] At step 408, the image orientation is calculated to identify the
vertices and geometry of the captured image. This information is used to
project an augmented image onto a plane.
[0051] At step 410, the corrected image is stretched to a predefined
dimension. This predefined dimension corresponds to the dimension of the
underlying 2D image. This stretched and corrected image is free of
deformation and/or distortions. Further, the fragment shader is used to fill
the stretched image with correct pixel values. The correct pixel values are
obtained from the image obtained from the camera, which is distorted and
deformed. The fragment shader picks up the pixels from the underlying 2D
image and then stretches them onto the final stretched image, which is free from
deformations and distortions.
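Steps 408-410 can be approximated on the CPU (rather than in a fragment shader) by ordering the four detected sheet corners and resampling the region to the predefined dimension; the output size below is a placeholder for the underlying 2D image's dimensions.

```python
import cv2
import numpy as np

def stretch_to_reference(image, corners, out_w=1024, out_h=768):
    """Order the detected sheet corners and stretch the region to the
    predefined reference dimension, free of perspective deformation."""
    pts = np.asarray(corners, dtype=np.float32)

    # Order corners as top-left, top-right, bottom-right, bottom-left.
    s = pts.sum(axis=1)
    d = np.diff(pts, axis=1).ravel()             # y - x for each corner
    ordered = np.array([pts[np.argmin(s)],       # top-left: smallest x + y
                        pts[np.argmin(d)],       # top-right: smallest y - x
                        pts[np.argmax(s)],       # bottom-right: largest x + y
                        pts[np.argmax(d)]],      # bottom-left: largest y - x
                       dtype=np.float32)

    target = np.array([[0, 0], [out_w - 1, 0],
                       [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)

    # warpPerspective resamples every output pixel from the distorted input,
    # much as the fragment shader described above fills the stretched image.
    M = cv2.getPerspectiveTransform(ordered, target)
    return cv2.warpPerspective(image, M, (out_w, out_h))
```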
[0052] FIG. 5 illustrates a flow diagram depicting steps for texture
generation. At step 502, a 3D model is loaded. The model can be fetched
from the database or the model may be created in real-time.
[0053] At step 504, the image transformation engine 26 obtains the
underlying 2D image which is captured via the image transformation interface
23.
[0054] At step 506, the image extraction module 207 of the image
transformation engine 26 extracts the image of the obtained underlying 2D
image. The procedure used to extract the image is detailed in FIG. 2.
[0055] At step 508, the image extraction module 207 identifies if the
image is deformed or not. If the image is deformed or distorted then the steps
from step 510 are followed. However, if the image is not distorted then step 512 is
performed.
[0056] At step 510, the image extraction module 207 removes all the
deformations and distortions from the image to create a reference image.
The details of removing deformation from the image will be discussed in FIG.
4.
[0057] At step 512, the UV mapping module 209 maps UV’s of each
vertex of the 3D model to the corresponding part of reference image to
create a texture of the 3D model. The UV’s may be stored in a predefined
table in the database.
[0058] With the above embodiments in mind, it should be understood
that the embodiments might employ various computer-implemented
operations involving data stored in computer systems. The embodiments also
relate to a device or an apparatus for performing these operations. The
apparatus can be specially constructed for the required purpose, or the
apparatus can be a general-purpose computer selectively activated or
configured by a computer program stored in the computer. In particular,
various general-purpose machines can be used with computer programs
written in accordance with the teachings herein, or it may be more
convenient to construct a more specialized apparatus to perform the
required operations.
[0059] A module, an application, a layer, an agent or other method-operable
entity could be implemented as hardware, firmware, or a processor
executing software, or combinations thereof. It should be appreciated that,
where a software-based embodiment is disclosed herein, the software can be
embodied in a physical machine such as a controller. For example, a
controller could include a first module and a second module. A controller
could be configured to perform various actions, e.g., of a method, an
application, a layer or an agent.
[0060] The embodiments can also be embodied as computer readable
code on a computer readable medium. The computer readable medium is
any data storage device that can store data, which can be thereafter read by
a computer system. Examples of the computer readable medium include solid
state drives, hard drives, SD cards, network attached storage (NAS), read-only
memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes,
and other optical and non-optical data storage devices. The computer
readable medium can also be distributed over a network-coupled computer
system so that the computer readable code is stored and executed in a
distributed fashion. Embodiments described herein may be practiced with
various computer system configurations including hand-held devices, tablets,
microprocessor systems, microprocessor-based or programmable consumer
electronics, minicomputers, mainframe computers and the like. The
embodiments can also be practiced in distributed computing environments
where tasks are performed by remote processing devices that are linked
through a wire-based or wireless network.
[0061] The foregoing description, for the purpose of explanation, has
been described with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or to limit the
invention to the precise forms disclosed. Many modifications and variations
are possible in view of the above teachings. The embodiments were chosen
and described in order to best explain the principles of the embodiments and
their practical applications, to thereby enable others skilled in the art to best
utilize the embodiments and various modifications as may be suited to the
particular use contemplated. Also, various presently unforeseen or
unanticipated alternatives, modifications, variations or improvements therein
may be subsequently made by those skilled in the art which are also intended
to be encompassed by the following claims.

We claim:
1. A method for providing an augmented reality based projection, the
method comprising:
creating and/or loading a three dimensional model;
capturing an underlying 2D image of a printed material to be
projected via an image transformation interface;
creating a reference image from the captured image;
using the reference image as a texture map for mapping UV’s of
each vertex of the 3D model to the corresponding part of the reference
image to create texture for projection on the underlying 2D image; and
producing an interactive augmented reality representation on the
underlying 2D image having the texture for display.
2. The method as claimed in claim 1, wherein the method comprises
facilitating interaction with the augmented reality representation for playing
a game.
3. The method as claimed in claim 1, wherein the creating a reference
image comprises removing one or more distortions from the captured image.
4. The method as claimed in claim 1, further comprising scaling the
augmented reality representation based on the proximity of a device held by
a user with the underlying 2D image.
5. The method as claimed in claim 1, wherein the method comprises
storing the texture map in the database.
6. A method for providing an augmented reality based projection by
removing image deformation and /or distortion, the method comprises:
creating and/or loading a three dimensional model;
capturing an underlying 2D image via an image transformation
interface;
creating a custom model view projection matrix for each frame of
the underlying 2D image;
creating a clipped image by clipping non-colored frames present in
the underlying 2D image;
calculating image orientation to identify vertices and geometry of
the underlying 2D image for projecting on a plane;
stretching the clipped image with fixed orientation and geometry
to a predefined dimension on the plane resulting in a reference image;
using the reference image as a texture map for mapping UV’s of
each vertex of the 3D model to the corresponding part of the reference
image to create texture for projection on the underlying 2D image; and
producing an interactive augmented reality representation on the
underlying 2D image having the texture for display.
7. An augmented reality system comprises:
an image transformation engine, the engine configured to
create and/or load a three dimensional model;
capture an underlying 2D image of a printed material to be
projected via an image transformation interface;
create a reference image from the captured image;
use the reference image as a texture map for mapping UV’s
of each vertex of the 3D model to the corresponding part of the
reference image to create texture for projection on the
underlying 2D image; and
produce an interactive augmented reality representation on
the underlying 2D image having the texture for display.

Documents

Orders

Section Controller Decision Date

Application Documents

# Name Date
1 3254-DEL-2015-RELEVANT DOCUMENTS [29-09-2021(online)].pdf 2021-09-29
2 3254-DEL-2015-RELEVANT DOCUMENTS [24-03-2020(online)].pdf 2020-03-24
3 3254-DEL-2015-RELEVANT DOCUMENTS [26-03-2019(online)].pdf 2019-03-26
4 3254-DEL-2015-PatentCertificate29-11-2018.pdf 2018-11-29
5 3254-DEL-2015-IntimationOfGrant29-11-2018.pdf 2018-11-29
6 3254-DEL-2015-Response to office action (Mandatory) [19-09-2018(online)].pdf 2018-09-19
7 3254-DEL-2015-Annexure (Optional) [19-09-2018(online)].pdf 2018-09-19
8 3254-DEL-2015-HearingNoticeLetter.pdf 2018-08-03
9 3254-DEL-2015-FER_SER_REPLY [18-07-2018(online)].pdf 2018-07-18
10 3254-DEL-2015-CLAIMS [18-07-2018(online)].pdf 2018-07-18
11 3254-DEL-2015-ABSTRACT [18-07-2018(online)].pdf 2018-07-18
12 3254-DEL-2015-OTHERS [18-07-2018(online)].pdf 2018-07-18
13 3254-DEL-2015-FER.pdf 2018-03-26
14 3254-DEL-2015-FORM 3 [21-02-2018(online)].pdf 2018-02-21
15 3254-DEL-2015-FORM 3 [09-08-2017(online)].pdf 2017-08-09
16 Form 18 [31-05-2017(online)].pdf 2017-05-31
17 EVIDENCE FOR SSI [24-05-2017(online)].pdf 2017-05-24
18 OTHERS [24-05-2017(online)].pdf 2017-05-24
19 OTHERS [15-05-2017(online)].pdf 2017-05-15
20 3254-DEL-2015-Correspondence-150217.pdf 2017-02-17
21 3254-DEL-2015-OTHERS-150217.pdf 2017-02-17
22 Form 3 [15-02-2017(online)].pdf 2017-02-15
23 Other Patent Document [10-02-2017(online)].pdf 2017-02-10
24 3254-DEL-2015-Power of Attorney-271016.pdf 2016-11-01
25 3254-DEL-2015-Correspondence-271016.pdf 2016-11-01
26 Form 26 [20-10-2016(online)].pdf 2016-10-20
27 Other Patent Document [20-10-2016(online)].pdf 2016-10-20
28 OnlinePostDating.pdf 2016-10-14
29 Description(Provisional) [09-10-2015(online)].pdf 2015-10-09
30 Form 5 [09-10-2015(online)].pdf 2015-10-09

Search Strategy

1 search11_17-11-2017.pdf

ERegister / Renewals

3rd renewal: 04 Dec 2018 (for the period 20/10/2017 to 20/10/2018)

4th renewal: 04 Dec 2018 (for the period 20/10/2018 to 20/10/2019)

5th renewal: 25 Sep 2019 (for the period 20/10/2019 to 20/10/2020)

6th renewal: 06 Oct 2020 (for the period 20/10/2020 to 20/10/2021)

7th renewal: 19 Oct 2021 (for the period 20/10/2021 to 20/10/2022)