
System And Method Of Digitizing Contents Dynamically Presented On A Non Descript Working Surface Augmented With User Action Based Control

Abstract: The present invention discloses a system and a method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control. The system comprises an optical unit for capturing the contents written on a non-descript working surface; a processing unit for processing the input received from the optical unit, comprising a pre-processing module, a colour identification module, a filtering module, a masking module, and an image smoothing module; an output unit for storing the output received from the image smoothing module of the processing unit; and a control unit for controlling the processing unit and/or the operation of the connected device based on user actions. Reference Figure: Figure 1


Patent Information

Application #
Filing Date
30 September 2021
Publication Number
05/2025
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Email
Parent Application

Applicants

DEPASOLUTIONS PRIVATE LIMITED
No. 21, 1st B Main Road, Caveri Layout, Nagarbhavi Main Road, Vijayanagar, Bengaluru, Karnataka, Pincode – 560040, India

Inventors

1. Karthik Srikanth Joshi
#9, 2nd cross, Bilekahalli Layout, B.G.Road, Bengaluru, Karnataka, Pincode - 560076, India
2. Sateesh Shankar Kannegala
#102, Palm Court, Jakkur Plantation Road, Jakkur, Bengaluru, Karnataka, Pincode - 560064, India
3. Aniruddha Kannal
B003, Amrutha Sparkling Nest, Garudacharpalya, Bengaluru, Karnataka, Pincode - 560048, India

Specification

FORM 2
THE PATENTS ACT, 1970
(39 of 1970)
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)

SYSTEM AND METHOD OF DIGITIZING CONTENTS DYNAMICALLY PRESENTED ON A NON-DESCRIPT WORKING SURFACE AUGMENTED WITH USER ACTION-BASED CONTROL

DEPASOLUTIONS PRIVATE LIMITED, a company incorporated under the laws of India, of the address No. 21, 1st B Main Road, Caveri Layout, Nagarbhavi Main Road, Vijayanagar, Bengaluru, Karnataka, Pincode -560040, India.

The following specification particularly describes the invention and the manner in which it is to be performed.
FIELD OF THE INVENTION
The present invention relates to a system and method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control.

BACKGROUND OF THE INVENTION
Learning in the present era has been highly impacted by technology. Nowadays, various devices are used to transmit the contents written on a board to a projector or over the internet. However, such devices are expensive and/or often have a steep learning curve, leading to low adoption rates.

Therefore, the object of the present invention is to solve one or more of the aforementioned issues.

BRIEF DESCRIPTION OF DRAWINGS
Reference will be made to embodiments of the invention, examples of which may be illustrated in the accompanying figure(s). These figure(s) are intended to be illustrative, not limiting. Although the invention is generally described in the context of these embodiments, it should be understood that it is not intended to limit the scope of the invention to these particular embodiments.
Figure 1 shows a system of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control according to an embodiment of the present invention;

Figure 2 shows steps of working of a colour identification module of a processing unit according to an embodiment of the present invention;

Figure 3 shows steps of working of an image smoothing module of a processing unit according to an embodiment of the present invention; and

Figure 4 shows steps of working of a control unit according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE INVENTION
The present invention relates to a system and method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control. In an embodiment, the present invention provides a system and a method to transform any non-descript surface on which one can write legibly, for example, a sheet of paper, a plastic board etc. to a smart surface.

The system and method use an optical unit, such as a camera, to capture contents written and/or being dynamically written on the working surface as input. A processing unit processes the input received from the optical unit to produce the output stream. The output stream can be channelled according to the user’s requirement. For instance, the output stream can be stored in the internal memory of a device or unit, broadcast over the internet, etc. The produced output, along with other necessary display items, may be projected using appropriate devices such as an image/video projector onto or in the periphery of the working surface. Additionally, a user can control the processing unit remotely, for example, using hand gestures. Figure 1 shows the system of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control.
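The capture, process, and output flow described above can be summarised as a simple pipeline. The specification does not prescribe any particular implementation; the following is a minimal sketch in which the function names, the toy noise threshold, and the use of scalar intensity values in place of real camera frames are all illustrative assumptions.

```python
# Toy sketch of the capture -> process -> output pipeline described above.
# All names and thresholds are hypothetical, not taken from the specification.

def preprocess(frame):
    # Suppress spurious low-intensity pixels (stand-in for noise removal).
    return [[0 if v < 10 else v for v in row] for row in frame]

def identify_colours(frame):
    # Collect the distinct non-background values present in the frame.
    return {v for row in frame for v in row if v > 0}

def filter_colours(frame, colours):
    # Retain only content rendered in the identified colours.
    return [[v if v in colours else 0 for v in row] for row in frame]

def process(frame):
    frame = preprocess(frame)
    colours = identify_colours(frame)
    return filter_colours(frame, colours)

frame = [[0, 5, 200], [128, 3, 128]]
print(process(frame))  # noisy low-intensity pixels are zeroed out
```

In a real system each stage would operate on full colour camera frames; the scalar values here only illustrate how the stages compose.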

The system and method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control comprises:

1. An optical unit (100) for capturing the contents written on a non-descript working surface. The optical unit for instance may be a camera. Additionally, the optical unit may include peripheral components such as colour filters.

2. A processing unit (120) for processing the input received from the optical unit. The processing unit comprises:

a. A pre-processing module for removing background noise in the input stream. For example, the background noise could be spurious dark or bright pixels introduced due to noise in the hardware used to capture the input or due to ambient light fluctuations;

b. A colour identification module for identification of colours present in the input received from the pre-processing module. The user may use multiple colours to render contents on the working surface, and all relevant contents, possibly rendered in multiple colours, have to be captured in order to produce a faithful output. These multi-coloured contents are captured using a two-pronged approach. Firstly, the contents are clustered by an unsupervised clustering module (220) based on shape, colour, or both. Secondly, the prominent colours present in the frame in focus of the input stream are inferred by obtaining the colour values present in the frame through a colour value extraction module (230). The colour values could be HSV values, for instance. These results are then fed into an estimation module (240) for identifying colours based on conditional probability, which returns the set of colours most probably used in creating the content. Figure 2 shows the working of the colour identification module;

c. A filtering module for identification and elimination of contents present in colours not identified by the colour identification module. The filtering module receives the colours identified by the colour identification module as the input and retains the contents in the input colours only. Additionally, the filtering module is configured to eliminate background objects, such as the hand of the user, that are captured in the input stream;

d. A masking module for identification and elimination of extraneous content not filtered by the filtering module. The masking module acts as the last layer of filtering: it masks out the areas of the input that do not contain the content of interest. Such extraneous content could be objects present in the ambience that were captured but not filtered out by the filtering module because they are in the same colour as the colours identified by the colour identification module; and

e. An image smoothing module for smoothing rugged portions of the content or for filling in portions of the content that were filtered out in the previous steps, such as a small missing part of a line segment. It uses a kernel (320) of a predetermined size, which is convolved with the part of the input of the same size as that of the kernel. Figure 3 shows the working of the image smoothing module.

3. An output unit (130) for storing the output received from the image smoothing module of the processing unit. Additionally, the output unit may be used as input to plugins supporting other software, projected using an appropriate device etc.

4. A control unit (140) for controlling the processing unit and/or the operation of the connected device, such as a computer, based on user actions. For instance, the output from the output unit, when projected, can also be used to control the processing unit by capturing user actions such as hand gestures. These user inputs can be optically captured using the same or different optical unit(s) as the one used to capture contents put out on the working surface. A predetermined set of gestures is uniquely mapped to certain controls of the processing unit and/or the computer. These gestures can be determined using, for example, a motion detection and path tracing method, through a motion detection module (420) and a path detection module (430), augmented with a classification method for classifying the gesture, through a classification module (440), to infer the desired control. Figure 4 shows the working of the control unit.
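The second prong of the colour identification described in item 2(b), inferring the prominent colours from the colour values present in a frame, can be sketched as a frequency estimate over those values. The hue representation and the 10% prominence threshold below are illustrative assumptions, not details from the specification.

```python
from collections import Counter

# Toy sketch of inferring the set of colours most probably used in creating
# the content: colours whose share of content pixels is prominent are kept,
# stray hues from noise are discarded. Threshold is a hypothetical choice.

def estimate_colours(hues, min_share=0.1):
    """Return the hues whose share of content pixels is at least min_share."""
    counts = Counter(hues)
    total = sum(counts.values())
    return {h for h, c in counts.items() if c / total >= min_share}

# 80 "blue" pixels (hue 120), 15 "red" pixels (hue 0), 5 stray noise hues.
hues = [120] * 80 + [0] * 15 + [33, 57, 90, 101, 115]
print(sorted(estimate_colours(hues)))  # → [0, 120]
```

A full implementation would combine this with the clustering prong, e.g. weighting each hue's share by how well it coincides with a content cluster, before the conditional-probability estimation step.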

A method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control comprises:
a) capturing, by an optical unit, the contents written on a non-descript working surface;
b) processing, by a processing unit, the input received from the optical unit;
c) storing, by an output unit, the output received from an image smoothing module of the processing unit; and
d) controlling, by a control unit, the processing unit and/or the operation of the connected device, such as a computer, based on user actions.
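The gesture inference behind step (d), motion detection, path tracing, and classification, can be sketched as follows. This is a minimal illustration under assumed conditions: greyscale frames as nested lists, frame differencing as the motion detector, and a swipe classifier keyed on the path's dominant direction; the gesture names are hypothetical.

```python
# Hypothetical sketch of gesture inference: detect motion by differencing
# consecutive frames, trace the centroid of the moving region, and classify
# the traced path as a simple directional swipe.

def centroid(prev, curr, thresh=30):
    """Centroid (row, col) of pixels that changed between two frames."""
    moved = [(r, c) for r, row in enumerate(curr) for c, v in enumerate(row)
             if abs(v - prev[r][c]) > thresh]
    if not moved:
        return None
    return (sum(p[0] for p in moved) / len(moved),
            sum(p[1] for p in moved) / len(moved))

def classify(path):
    """Classify a traced centroid path by its dominant direction."""
    (r0, c0), (r1, c1) = path[0], path[-1]
    if abs(c1 - c0) > abs(r1 - r0):
        return "swipe_right" if c1 > c0 else "swipe_left"
    return "swipe_down" if r1 > r0 else "swipe_up"

def frame_with_blob(col):
    # 3x3 frame with a bright blob in the middle row at the given column.
    f = [[0, 0, 0] for _ in range(3)]
    f[1][col] = 255
    return f

frames = [frame_with_blob(c) for c in range(3)]  # blob moves left to right
path = [centroid(frames[i], frames[i + 1]) for i in range(2)]
print(classify(path))  # → swipe_right
```

The classified gesture would then be looked up in the predetermined gesture-to-control mapping to drive the processing unit or the connected computer.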

The step (b) of processing comprises:
a) removing, by a pre-processing module, background noise in the input stream;
b) identifying, by a colour identification module, colours present in the input received from the pre-processing module;
c) identifying and eliminating, by a filtering module, contents present in colours not identified by the colour identification module;
d) identifying and eliminating, by a masking module, extraneous content not filtered by the filtering module; and
e) smoothing, by an image smoothing module, rugged portions of the content or filling in portions of the content that were filtered out in the previous steps, such as a small missing part of a line segment.
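The smoothing in step (e), convolving a kernel of predetermined size with same-sized parts of the input, can be sketched in one dimension. The 1x3 averaging kernel and the 0.5 re-binarisation threshold below are assumptions made for brevity; a real implementation would convolve a 2-D kernel over the full frame.

```python
# Illustrative sketch of kernel smoothing: slide a 1xk averaging window along
# a binary row and re-binarise, so a one-pixel gap in a line segment is filled.

def smooth_row(row, k=3):
    """Convolve a 1xk averaging kernel along a row, then threshold at 0.5."""
    half = k // 2
    out = []
    for i in range(len(row)):
        window = row[max(0, i - half):i + half + 1]  # clipped at the edges
        out.append(1 if sum(window) / len(window) >= 0.5 else 0)
    return out

line = [1, 1, 1, 0, 1, 1, 1]  # a line segment with a one-pixel gap
print(smooth_row(line))       # → [1, 1, 1, 1, 1, 1, 1]
```

The kernel size trades off gap-filling against blurring: a larger kernel fills wider gaps but also rounds off genuinely sharp content.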

The foregoing description of the invention has been presented merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the disclosure.
CLAIMS
We Claim:
1. A system of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control comprises:
an optical unit for capturing the contents written on a non-descript working surface;
a processing unit for processing the input received from the optical unit;
an output unit for storing the output received from an image smoothing module of the processing unit; and
a control unit for controlling the processing unit and/or the operation of the connected device, such as a computer, based on user actions.

2. The system of digitizing contents as claimed in Claim 1, wherein the optical unit includes a camera and a peripheral component.

3. The system of digitizing contents as claimed in Claim 1, wherein the output unit may be used as input to plugins supporting other software, projected using an appropriate device, etc.

4. The system of digitizing contents as claimed in Claim 1, wherein the processing unit comprises:

a pre-processing module for removing background noise in the input stream;
a colour identification module for identification of colours present in the input received from the pre-processing module;
a filtering module for identification and elimination of contents present in colours not identified by the colour identification module;
a masking module for identification and elimination of extraneous content not filtered by the filtering module; and
an image smoothing module for smoothing rugged portions of the content or for filling in portions of the content that were filtered out in the previous steps, such as a small missing part of a line segment.

5. A method of digitizing contents dynamically presented on a non-descript working surface augmented with user action-based control comprises:
capturing, by an optical unit, the contents written on a non-descript working surface;
processing, by a processing unit, the input received from the optical unit;
storing, by an output unit, the output received from an image smoothing module of the processing unit; and
controlling, by a control unit, the processing unit and/or the operation of the connected device, such as a computer, based on user actions.
6. The method of digitizing contents as claimed in Claim 5, wherein the step of processing comprises:
removing, by a pre-processing module, background noise in the input stream;
identifying, by a colour identification module, colours present in the input received from the pre-processing module;
identifying and eliminating, by a filtering module, contents present in colours not identified by the colour identification module;
identifying and eliminating, by a masking module, extraneous content not filtered by the filtering module; and
smoothing, by an image smoothing module, rugged portions of the content or filling in portions of the content that were filtered out in the previous steps, such as a small missing part of a line segment.

Documents

Application Documents

# Name Date
1 202141044852-PROVISIONAL SPECIFICATION [30-09-2021(online)].pdf 2021-09-30
2 202141044852-FORM FOR STARTUP [30-09-2021(online)].pdf 2021-09-30
3 202141044852-FORM FOR SMALL ENTITY(FORM-28) [30-09-2021(online)].pdf 2021-09-30
4 202141044852-FORM 1 [30-09-2021(online)].pdf 2021-09-30
5 202141044852-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-09-2021(online)].pdf 2021-09-30
6 202141044852-EVIDENCE FOR REGISTRATION UNDER SSI [30-09-2021(online)].pdf 2021-09-30
7 202141044852-DRAWINGS [30-09-2021(online)].pdf 2021-09-30
8 202141044852-DRAWING [30-09-2022(online)].pdf 2022-09-30
9 202141044852-COMPLETE SPECIFICATION [30-09-2022(online)].pdf 2022-09-30
10 202141044852-FORM-26 [24-01-2024(online)].pdf 2024-01-24
11 202141044852-FORM 18 [06-02-2025(online)].pdf 2025-02-06