An Automated Indexing Of Visual Data

Abstract: A method for automatic indexing of visual data is provided. The method includes splitting the visual data into two distinct streams; indexing each of the streams based on a parameter; establishing a correlation between the indexed streams; and displaying a correlated indexed integrated stream. A system for automatic indexing of visual data is also provided.


Patent Information

Application #
Filing Date
15 February 2013
Publication Number
19/2013
Publication Type
INA
Invention Field
COMPUTER SCIENCE
Status
Parent Application
Patent Number
Legal Status
Grant Date
2023-12-19
Renewal Date

Applicants

KPOINT TECHNOLOGIES
8TH FLOOR, AMAR ARMA GENESIS, BANER ROAD, BANER, PUNE – 411045, MAHARASHTRA, INDIA

Inventors

1. ATUL NARKHEDE
31, ALANKAR SOCIETY, KARVENAGAR, PUNE – 411052, MAHARASHTRA, INDIA
2. AVIJIT SEN MAJUMDER
A 10/8, COMFORT ZONE, BALEWADI, PUNE – 411045, INDIA

Specification

CLAIMS:

1. A method for automatic indexing of visual data, the method comprising:
splitting the visual data into at least two distinct streams of the data;
indexing each of the streams based on at least one parameter;
establishing a correlation between each of the indexed streams; and
displaying a correlated indexed integrated stream.
2. The method according to claim 1, wherein the first stream of data can be an encoder stream, wherein the encoder stream comprises continuous screen updates of the visual data measured at a predefined polling interval.
3. The method according to claim 1, wherein the second stream of data can be an index stream, wherein the index stream comprises a series of tuples.
4. The method according to claim 1, wherein each tuple of the series comprises at least two components, wherein the first component can be a temporal text and the second component can be a visual representation.
5. The method according to claim 1, wherein the integrated stream for display is constructed from the time-ordered list of the indexed series of tuples.

6. A system for automatic indexing of visual data, the system comprising:
a screen capture module;
an indexer coupled to the screen capture module;
a recorder coupled to the indexer; and
a player for displaying the recorded indexed data.
7. The system according to claim 6, wherein the screen capture module comprises:
a frame-grab unit for capturing the visual data as a series of time stamped image frames;
an encoder for compressing the time stamped frames of the visual data; and
an adaptor for automatically generating temporal text index from the encoded visual data.
8. A method for searching indexed visual data, wherein the method comprises:
receiving a query from a user, wherein the query comprises at least one keyword;
examining the occurrence of the keyword in the indexed visual data to obtain a pattern match; and
displaying at least one instance of the keyword in the pattern match.
9. The method of claim 8, wherein the keyword can be an alphanumeric text string.

10. The method of claim 8, wherein the displaying of the occurrence is a temporal display.
AN AUTOMATED INDEXING OF VISUAL DATA

FIELD OF INVENTION
The invention generally relates to the field of data analytics and particularly to a system and method for automated indexing of visual data.

BACKGROUND
A textual search is mostly dependent on keywords and is commonly performed across many search engines. Video data are annotated for selecting a particular instance in the video. However, digital content also comprises visual data, which includes but is not limited to textual data, pictorial data and video streams, referred to hereinafter as rich content. When rich content is recorded and shared over the internet, it is normally stored as a video. Although videos are annotated for easy scrolling, there are no methods available to make rich content searchable. Further, annotations on shared video, for example manually inserted tags on visual data sharing platforms including but not limited to Flickr, SlideShare and YouTube, are merely descriptive keywords for the entire video. One of the disadvantages of the annotations provided by the aforementioned portals is that the annotations are mere indicators of the title of the visual data and not markers to precise locations within the visual data.

BRIEF DESCRIPTION OF DRAWINGS
So that the manner in which the recited features of the invention can be understood in detail, some of the embodiments are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
FIG. 1 illustrates a schematic representation of the indexed output of visual data, according to an embodiment of the invention.
FIG. 2 illustrates a flow chart of an indexing routine of the adaptor, according to an embodiment of the invention.
FIG. 3 illustrates a screen capture module of an automated indexing system, according to an embodiment of the invention.
FIG. 4 illustrates various components of an application-specific adaptor, according to an embodiment of the invention.
FIG. 5 illustrates a search routine over the text index of a recording, according to an embodiment of the invention.
FIG. 6 illustrates a viewer session, according to an embodiment of the invention.
FIG. 7 illustrates a flow chart of a web page indexing routine, according to an embodiment of the invention.

SUMMARY
One aspect of the invention includes a method for automatic indexing of visual data. The method includes splitting the visual data into two distinct streams; indexing each of the streams based on a parameter; establishing a correlation between the indexed streams; and displaying a correlated indexed integrated stream.
Another aspect of the invention provides a system for automatic indexing of visual data. The system includes a screen capture module; an indexer coupled to the screen capture module; a recorder coupled to the indexer; and a player for displaying the recorded indexed data.

DETAILED DESCRIPTION
Various embodiments of the invention provide automated indexing of visual data. FIG. 1 illustrates a schematic representation of the indexed output of visual data, according to an embodiment of the invention.
One embodiment of the invention provides a method for automatic indexing of the visual data 101. The method includes splitting the visual data into two distinct streams of data 103 and 105; indexing each of the streams based on a parameter; establishing a correlation between the indexed streams; and displaying a correlated indexed integrated stream 107. Initially, the visual data 101 is split into at least two distinct streams of data 103 and 105 during a screen capture session. The screen capture session originates from an endpoint, which can be a recording session occurring at any given location. The recorded session is then retrievably uploaded onto a server for access by the initiated screen capture session. The two distinct streams of split data are an encoder stream 103 and an index stream 105. The encoder stream 103 is a function of the current clock time (t) and represents continuous screen updates at a predefined polling interval. The encoder stream 103 uses a compression mechanism, known to a person skilled in the art, to enable quick upload and delivery of the stream over the public internet. The encoder stream 103 is retrievably saved as a media file by the recording session running at the server. The method of obtaining the index stream 105 is described herein below in detail.
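A minimal Python sketch of this two-stream split follows; the class and callback names (CaptureSession, grab_frame, is_index_worthy) are illustrative assumptions, not terms from the patent.

```python
import time

# Illustrative sketch only: a capture session that forks each screen
# update into a continuous encoder stream and a sparse index stream.
class CaptureSession:
    def __init__(self, poll_interval=0.5):
        self.poll_interval = poll_interval
        self.encoder_stream = []  # (t, frame) at every polling tick
        self.index_stream = []    # timestamps of index-worthy changes

    def run(self, grab_frame, is_index_worthy, duration):
        start = time.time()
        while time.time() - start < duration:
            t = time.time()
            frame = grab_frame()
            self.encoder_stream.append((t, frame))   # encoder stream 103
            if is_index_worthy(frame):
                self.index_stream.append(t)          # index stream 105
            time.sleep(self.poll_interval)
```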
FIG. 2 illustrates a flow chart of an indexing routine of the adaptor, according to an embodiment of the invention. Upon initialization of the indexing routine 201, a change 203 in the presentation of the visual data is detected. The change can be a scene change, which includes but is not limited to a change of slide in a PowerPoint presentation, writing on a suitable medium, display of an image and the like. The detected screen change is then analyzed to estimate the extent of the screen change 205. If the screen change is significant, the stream is indexed 207. In one example of the invention, the screen change is determined through a delta function. The value of the delta function depends on the nature of the visual stream being indexed. If the visual stream is a PowerPoint presentation, the delta function is set between the limits 0 and 1, wherein 0 indicates no change of slide and 1 indicates a change to the next slide in the presentation.
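A hedged sketch of such a delta function is shown below: binary for slide decks, and a normalized pixel difference for raw frames. The threshold value is an assumption; the patent leaves it to per-stream heuristics.

```python
# Delta heuristic sketch: 0/1 for slides, normalized difference for frames.
def slide_delta(prev_slide_no, cur_slide_no):
    return 0 if cur_slide_no == prev_slide_no else 1   # 0: same slide, 1: next

def frame_delta(prev, cur):
    # prev, cur: equally sized grayscale frames as flat byte sequences
    diff = sum(abs(a - b) for a, b in zip(prev, cur))
    return diff / (255 * len(cur))                     # normalized to [0, 1]

SIGNIFICANT = 0.2   # assumed threshold, not specified by the patent

def should_index(delta):
    return delta >= SIGNIFICANT                        # indexing step 207
```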
The index stream carries a series of tuples. Each tuple comprises at least two components, namely a temporal text and a visual representation inside the recorded media. An index tuple is defined as a function, Tuple = fn(T, TXT, IMG) 213, wherein T is a discrete clock time associated with a significant, index-worthy screen change, TXT 209 is the relevant text extracted from the screen elements, and IMG is a pictorial view of the captured screen. The index stream is retrievably saved inside a structured text index, by the recording session, at the server. The IMG 211 is derived by initiating the encoder to take an image snapshot of the encoded screen at the time T 215. At the end of indexing, a correlated structured index stream is obtained, which is then adapted for display. During display, a correlated indexed integrated stream is constructed from the time-ordered list of index tuples (T, TXT, IMG). Each IMG from the indexed tuples is fetched asynchronously by a player on the basis of the index stream.
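Transcribed as a small data structure, the tuple definition looks like the following; the field names follow the patent, while the class itself is illustrative.

```python
from dataclasses import dataclass

# Tuple = fn(T, TXT, IMG) rendered as a record type.
@dataclass
class IndexTuple:
    T: float      # discrete clock time of the index-worthy screen change
    TXT: str      # relevant text extracted from the screen elements
    IMG: bytes    # image snapshot of the encoded screen at time T

def build_integrated_stream(tuples):
    # the correlated integrated stream is the time-ordered tuple list
    return sorted(tuples, key=lambda tup: tup.T)
```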
The invention also provides a system for automatic indexing of visual data. The system includes a screen capture module; an indexer coupled to the screen capture module; a recorder coupled to the indexer; and a player for displaying the recorded indexed data.
FIG. 3 illustrates a screen capture module of an automated indexing system, according to an embodiment of the invention. The screen capture module 300 is configured for media capture and is also capable of automatically generating a text index of the captured media. The screen capture module runs locally at the endpoints from where the screen needs to be captured and connects, via an internet connection, to a recording server available on a public cloud. The screen capture module consists of three major functional blocks, namely a frame-grab 301, an encoder 303 and an adaptor 305. The frame-grab component 301 is responsible for capturing screen frames as raw images at a predefined interval and then forwarding them to the encoder 303 for compression. The encoder 303 calculates the difference between two consecutive images in terms of rectangular blocks and then compresses them with various encoding mechanisms, known to a person skilled in the art. The resulting output is carried over the encoder stream 307. The adaptor 305 plays the most critical role, auto-generating the temporal text index 309. Adaptors are classified broadly into at least two categories, namely an application-specific adaptor and a system-specific adaptor.
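A reduced sketch of the encoder block 303 follows, with the rectangle diff collapsed to "which fixed-size byte rows changed"; a real encoder finds changed rectangles and uses proper video compression, so this is only a toy stand-in.

```python
import zlib

# Toy encoder: keep only the byte rows of the raw frame that changed
# since the previous frame, then compress them.
class Encoder:
    def __init__(self, row_bytes=1024):
        self.row_bytes = row_bytes
        self.prev = None
        self.stream = []                           # encoder stream 307

    def push(self, t, frame: bytes):
        prev = self.prev if self.prev is not None else bytes(len(frame))
        n = self.row_bytes
        changed = b"".join(
            frame[i:i + n]                         # rows that differ
            for i in range(0, len(frame), n)
            if frame[i:i + n] != prev[i:i + n]
        )
        self.stream.append((t, zlib.compress(changed)))
        self.prev = frame
```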
The application-specific adaptor latches on to a specific application, which includes but is not limited to a web browser or a Microsoft PowerPoint presentation, and observes the changes happening in the application context during screen capture. Each of these adaptors applies a set of heuristics to determine a significant index-worthy transition. The working of the application-specific adaptor is explained in detail herein below. It also runs an application-specific connector to extract meaningful text associated with the screen transition. The system-specific adaptor is specific to each target device on which screen capture happens. Like the application-specific adaptor, it observes changes happening on a device screen from a system-level perspective and applies heuristics to find index-worthy transitions.
FIG. 4 illustrates various components of an application-specific adaptor, according to an embodiment of the invention. The app connector component 401 is responsible for attaching to the parent application. Examples of parent applications include but are not limited to browsers, operating-system-specific applications such as Microsoft Office, StarOffice and Java applications, and all such applications wherein the adaptor linkage can be established. In one example the parent application can be an add-on, including but not limited to browser add-ons and Microsoft Office add-ons. The app connector 401 is configured for tracking important events originated by the application during usage. The app connector is also configured for capturing text changes associated with the parent application. The event watcher 403 analyzes the importance of events from the screen-change perspective, for example a change of web page or a transition to the next slide. Once an event is selected, it notifies the text analyzer 405 with the text captured at the app connector. The text analyzer 405 parses and cleans the text dump, then applies heuristics to generate relevant text 402 as 'TXT' for the index tuple. The adaptor signals the encoder 407 to take an image snapshot of the encoded screen to generate IMG 404. Finally, each tuple is completed with the current clock time 406, T. This framework can be easily extended to various applications and devices to increase the quality of the index and hence of the search results.
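A hedged sketch of this chain is given below: the event watcher 403 filters index-worthy events and the text analyzer 405 reduces the raw text dump to TXT. The event names and the cleaning heuristic are assumptions, not from the patent.

```python
# Adaptor chain sketch: filter important events, then extract TXT.
IMPORTANT_EVENTS = {"slide_changed", "page_loaded"}    # assumed event names

def text_analyzer(raw_text, max_words=12):
    # toy cleaning heuristic: keep the leading words as the relevant TXT
    return " ".join(raw_text.split()[:max_words])

def event_watcher(event, raw_text, emit):
    if event in IMPORTANT_EVENTS:                      # screen-change filter
        emit(text_analyzer(raw_text))                  # TXT for the tuple
```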

FIG. 5 illustrates a search routine over the text index of a recording, according to an embodiment of the invention. During the search routine, a user provides a query 501 to retrieve a particular instance of a recording. In one example of the invention, the query comprises at least one keyword. The keyword can be an alphanumeric string with a minimum of one character. A suggestor 503 is configured for displaying all occurrences matching the first typed characters of the keyword. The related occurrences are fetched from the text index stored in a database 505. Upon selection of a particular keyword 502 by the user, a pattern recognition unit 507 examines the occurrence of the keyword in the database 505 to obtain at least one pattern match 504. The pattern recognition unit 507 displays at least one instance of the keyword in the pattern match as a temporally arranged search result 509. The user then selects a particular temporal search result for viewing. The quality of the search results depends on the quality of the index in both the temporal and the textual sense.
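A sketch of the routine under simplifying assumptions follows: the text index is an in-memory list of (T, TXT) pairs rather than a database 505, and pattern matching is plain substring search.

```python
# Suggestor 503 and pattern match 504/509 over a toy in-memory index.
def suggest(index, prefix):
    # all indexed words starting with the typed characters
    words = {w for _, txt in index for w in txt.lower().split()}
    return sorted(w for w in words if w.startswith(prefix.lower()))

def search(index, keyword):
    # every tuple whose TXT contains the keyword, temporally arranged
    return sorted((t, txt) for t, txt in index if keyword.lower() in txt.lower())

index = [(12.0, "Quarterly results"), (95.5, "Revenue breakdown")]
print(suggest(index, "re"))      # -> ['results', 'revenue']
print(search(index, "revenue"))  # -> [(95.5, 'Revenue breakdown')]
```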
In an alternate embodiment of the invention, the search routine can be initiated over the internet. In the context of internet search, each index tuple can result in a temporal URL leading to a precise point inside the recording, instead of pointing to the entire recording as a black box. These URLs can be easily shared and embedded to promote a screen-sharing recording on the internet as effectively as a textual web page.
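One plausible shape for such a temporal URL is sketched below, assuming a "?t=<seconds>" query parameter; the patent does not prescribe a concrete format.

```python
# Hypothetical temporal-URL encoding: recording URL plus a time offset.
def temporal_url(base_url: str, t_seconds: float) -> str:
    return f"{base_url}?t={int(t_seconds)}"

print(temporal_url("https://example.com/recording/42", 95.5))
# -> https://example.com/recording/42?t=95
```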

FIG. 6 illustrates a viewer session, according to an embodiment of the invention. A screen player 601 is configured to play a media recording obtained from the encoded stream 602. Further, the player is coupled with an enhanced visual navigation strip 603. The visual strip 603 is derived from the index stream 604 of the recording. Each entry in the strip 603 corresponds to an index tuple with time, text and image snapshot. This improves both navigation and readability. It is also possible to search for any text string available within the text index of the recording; search results typically highlight the matching visual strip entries for easier navigation. In another embodiment of the invention, a user is allowed to click on a visually available tuple element displayed on the screen. Upon selection of a particular tuple element, the player seeks to the corresponding time in the encoded stream to facilitate easy navigation.
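The strip interaction reduces to a seek, as in the sketch below; Player is a stand-in for the screen player 601, and the tuple reuses the IndexTuple shape sketched earlier.

```python
# Clicking a strip entry seeks the player to that tuple's time T.
class Player:
    def __init__(self):
        self.position = 0.0          # current playback time in seconds

    def seek(self, t):
        self.position = t            # jump the encoded stream to time t

def on_strip_click(player, index_tuple):
    player.seek(index_tuple.T)       # tuple element -> playback position
```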

FIG. 7 illustrates a flow chart of a web page indexing routine, according to an embodiment of the invention. The webpage indexer is the most effective adaptor built on the adaptor framework. Most software applications are increasingly becoming web-browser based; similarly, most information is consumed, created and shared via web pages. So when one records a screen, it is very likely that the target screen is a web page viewed through a web browser. The webpage indexer is a multi-platform entity, which acts as an add-on to the generally available web browsers 701, including but not limited to Chrome, Firefox, Internet Explorer and Safari. It observes every page change happening within the browser control 703 and derives an important text representation using standard web-crawling methodologies adopted by web search engines. As a result, the tuple index produced by the webpage indexer adaptor is of very high quality in terms of relevance. As illustrated, a domain-specific handler is provided for determining the nature of the loaded page 705. If the page is a generic page, it is sent for determining initiation of indexing 713. If the loaded page is a domain-specific page, the domain-specific handler transforms the data for indexing 707 based on a significant index-worthy event 709. Upon initiation of the indexing routine, the displayed data is indexed 711 according to the method described herein before.
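A sketch of this dispatch under assumed data shapes follows: a loaded page is a dict with "domain", "title" and "time" keys, and the domain handler table is hypothetical.

```python
# FIG. 7 dispatch sketch: domain-specific pages are transformed (707),
# generic pages are indexed directly (713).
DOMAIN_HANDLERS = {
    "docs.example.com": lambda page: page.get("main_text", ""),  # hypothetical
}

def on_page_load(page, index):
    handler = DOMAIN_HANDLERS.get(page["domain"])
    text = handler(page) if handler else page.get("title", "")   # generic page
    if text:                                # index-worthy event check (709)
        index.append((page["time"], text))  # index the displayed data (711)
```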
The invention as described herein provides a method for automated indexing of visual data. One of the advantages of the method described herein is enabling automatic searching of a recording. Another advantage is building a visual index on the recording for quick scanning and random access. The indexing process is completely independent of any manual intervention. Application-specific adaptors for the most commonly used applications (e.g., browsers and Microsoft PowerPoint presentations) are incorporated to generate a precise index for search and visual navigation. The method and system described herein have many applications, including but not limited to brand promotion, product documentation, training, customer support, and building a rich multimedia knowledge-base library.

The foregoing description of the invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof.

Documents

Application Documents

# Name Date
1 451-MUM-2013-FORM 13 [14-02-2025(online)]-1.pdf 2025-02-14
2 451-MUM-2013-FORM 13 [14-02-2025(online)].pdf 2025-02-14
3 451-MUM-2013-FORM-26 [14-02-2025(online)].pdf 2025-02-14
4 451-MUM-2013-POA [14-02-2025(online)]-1.pdf 2025-02-14
5 451-MUM-2013-POA [14-02-2025(online)].pdf 2025-02-14
6 451-MUM-2013-EVIDENCE FOR REGISTRATION UNDER SSI [16-01-2024(online)].pdf 2024-01-16
7 451-MUM-2013-FORM FOR SMALL ENTITY [16-01-2024(online)].pdf 2024-01-16
8 451-MUM-2013-IntimationOfGrant19-12-2023.pdf 2023-12-19
9 451-MUM-2013-PatentCertificate19-12-2023.pdf 2023-12-19
10 451-MUM-2013-US(14)-ExtendedHearingNotice-(HearingDate-13-05-2021).pdf 2021-10-03
11 451-MUM-2013-US(14)-HearingNotice-(HearingDate-30-04-2021).pdf 2021-10-03
12 451-MUM-2013-2. Marked Copy under Rule 14(2) [27-05-2021(online)].pdf 2021-05-27
13 451-MUM-2013-Retyped Pages under Rule 14(1) [27-05-2021(online)].pdf 2021-05-27
14 451-MUM-2013-Written submissions and relevant documents [27-05-2021(online)].pdf 2021-05-27
15 451-MUM-2013-Correspondence to notify the Controller [10-05-2021(online)].pdf 2021-05-10
16 451-MUM-2013-Correspondence to notify the Controller [29-04-2021(online)].pdf 2021-04-29
17 451-MUM-2013-FORM-26 [29-04-2021(online)].pdf 2021-04-29
18 451-MUM-2013-2. Marked Copy under Rule 14(2) (MANDATORY) [20-05-2019(online)].pdf 2019-05-20
19 451-MUM-2013-COMPLETE SPECIFICATION [20-05-2019(online)].pdf 2019-05-20
20 451-MUM-2013-DRAWING [20-05-2019(online)].pdf 2019-05-20
21 451-MUM-2013-FER_SER_REPLY [20-05-2019(online)].pdf 2019-05-20
22 451-MUM-2013-OTHERS [20-05-2019(online)].pdf 2019-05-20
23 451-MUM-2013-Retyped Pages under Rule 14(1) (MANDATORY) [20-05-2019(online)].pdf 2019-05-20
24 451-MUM-2013-FER.pdf 2018-11-19
25 451-MUM-2013-CORRESPONDENCE(3-4-2013).pdf 2018-08-11
26 451-MUM-2013-FORM 1(3-4-2013).pdf 2018-08-11
27 451-MUM-2013-FORM 18.pdf 2018-08-11
28 451-MUM-2013-FORM 26(3-4-2013).pdf 2018-08-11
29 451-MUM-2013-FORM 3(3-4-2013).pdf 2018-08-11
30 451-MUM-2013-FORM 5(3-4-2013).pdf 2018-08-11
31 ABSTRACT1.jpg 2018-08-11
32 Comp_spec_kpoint_revised_final.pdf 2018-08-11
33 drawings_FINAL.pdf 2018-08-11
34 form3.pdf 2018-08-11
35 form5.pdf 2018-08-11
36 poa_kpoint.pdf 2018-08-11

Search Strategy

1 searchstrategy_16-11-2018.pdf

ERegister / Renewals

3rd: 16 Jan 2024 (From 15/02/2015 - To 15/02/2016)
4th: 16 Jan 2024 (From 15/02/2016 - To 15/02/2017)
5th: 16 Jan 2024 (From 15/02/2017 - To 15/02/2018)
6th: 16 Jan 2024 (From 15/02/2018 - To 15/02/2019)
7th: 16 Jan 2024 (From 15/02/2019 - To 15/02/2020)
8th: 16 Jan 2024 (From 15/02/2020 - To 15/02/2021)
9th: 16 Jan 2024 (From 15/02/2021 - To 15/02/2022)
10th: 16 Jan 2024 (From 15/02/2022 - To 15/02/2023)
11th: 16 Jan 2024 (From 15/02/2023 - To 15/02/2024)
12th: 16 Jan 2024 (From 15/02/2024 - To 15/02/2025)
13th: 14 Feb 2025 (From 15/02/2025 - To 15/02/2026)
14th: 14 Feb 2025 (From 15/02/2026 - To 15/02/2027)
15th: 14 Feb 2025 (From 15/02/2027 - To 15/02/2028)
16th: 14 Feb 2025 (From 15/02/2028 - To 15/02/2029)
17th: 14 Feb 2025 (From 15/02/2029 - To 15/02/2030)