A Method For Generating Event Evidencing Video Clip For Events Occurred In Closely Spaced Timeline And A System Thereof

Abstract: The present invention discloses an innovative method for generating a single event clip against multiple events, where the event clip size and duration in the timeline are dynamically controlled to generate an optimal number of event clips, and metadata is generated and embedded in the event clip itself to tag the event instants. The present method involves a clip generator cooperative to an imaging camera for generating a video clip, particularly an event clip, from a stream of video frames; a navigator for easy navigation across video frames in a file based on event metadata stored in the same file, or for fast forward and reverse fast forward; and a streaming device cooperative to the navigator enabling the user to download only the required video frames instead of the complete video files for further processing, viz. rendering the video, or transcoding into other formats on demand.

Patent Information

Application #
Filing Date
26 March 2018
Publication Number
39/2019
Publication Type
INA
Invention Field
ELECTRONICS
Status
Email
anjanonline@vsnl.net
Parent Application
Patent Number
Legal Status
Grant Date
2025-04-07
Renewal Date

Applicants

VIDEONETICS TECHNOLOGY PRIVATE LIMITED
PLOT-5, BLOCK-BP, SALT LAKE, KOLKATA WEST BENGAL INDIA 700091

Inventors

1. BOSE, Tuhin
FD 455/5, Sector III, Salt Lake City, Kolkata West Bengal India 700106
2. KUMAR, Dhiraj
Ward No 14, Raxa Rahimpur, Dhaka, East Champaran, Bihar India 845418

Specification

FIELD OF THE INVENTION:
The present invention relates to the generation and distribution of video evidence against any event of interest. More specifically, the present invention is directed to developing an event evidencing video clip generating method for multiple events occurring in a closely spaced timeline without any redundant video data.

BACKGROUND OF THE INVENTION:
Presently, with the development of different video surveillance devices, the generation and distribution of evidence video against any event of interest has become a regular task, particularly but not limited to the domain of video analytics applications.
Traditionally, a video clip of predefined duration is generated against each event. Each such event clip is of a certain pre-defined duration, say 10 seconds. However, when the frequency of events is very high, say one event per second, multiple such 10-second event clips will contain video of the same time span. The situation is explained in the accompanying figure 1, which shows how the occurrence of events in quick succession generates redundant video data in storage.
As evident from figure 1, the video from the time segments (T3 to T2) and (T5 to T4) is stored more than once, thus wasting storage space. Though the three consecutive events have occurred within a time span of 4 seconds, three independent video clips will be stored in the storage device under the traditional event clip generation approach, and there will be three records in the database to index the storage paths.
Thus, there has been a need for developing a new method for generating event evidencing video clips which could generate an optimized number of event clips for multiple events occurring in a closely spaced timeline without the redundant video data.

OBJECT OF THE INVENTION:
It is thus the basic object of the present invention to develop a method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline while avoiding generation of redundant video data.
Another object of the present invention is to develop a method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline which would be adapted to generate an optimal number of event clips and to generate and embed metadata in the event clip itself to tag the event instants, avoiding generation of redundant video data.
Yet another object of the present invention is to develop a method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline which would be adapted to facilitate easy navigation across video frames corresponding to the event clips based on event metadata, or for fast forward and reverse fast forward.
A still further object of the present invention is to develop a method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline which would be adapted to generate the event clip in a preferred file format for quick search of video segments in the file corresponding to various event types and suitable for post-processing the event clips.
Yet another object of the present invention is to develop a system for generating event evidencing video clips against multiple events occurring in a closely spaced timeline while avoiding generation of redundant video data.

SUMMARY OF THE INVENTION:
Thus, according to the basic aspect of the present invention, there is provided a method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline, comprising:
generating at least a video clip of a certain time duration from a stream of video frames covering multiple events captured in the video stream that are closely spaced in the timeline;
storing the video clip in a memory buffer under an event clip file with event metadata for all the events covered by the video clip, containing event timestamps and event type information;
a clip generator monitoring the memory buffer continuously for the arrival of new event metadata in the event clip file, whereby the event clip file is closed if no further event is generated within a pre-fixed time limit or the file size goes beyond a pre-fixed size in bytes;
tagging all the events in the event clip file that occurred within the timestamp of the first frame and that of the last frame in the event clip file as a form of header information in the event clip file against each Group of Pictures (GOP);
storing the tagged event clip file in a permanent storage element with a database record having searching attributes for later searching of the file by a navigator, based on a request for relevant video against any particular event within a given span of time, and to identify, among the multiple such event clip files stored in the permanent storage, the file that includes the time requested.

In the present method, the searching attributes for later searching of the event clip file include the first frame start time and the last frame start time along with the file path in the permanent storage.

In the present method, the navigator, after getting the event clip file path from the database, reads the header of each GOP of the event clip file;
wherein each GOP contains the byte offset of the next GOP in the event clip file to enable quick navigation across successive GOPs; and
wherein each GOP header contains the event metadata to enable the navigator to identify whether the GOP contains the requested type of events or not and to decide whether to skip the GOP or stream it.

In the present method, a group header is constructed involving the event types of the events that have occurred during the time interval determined by the first GOP to the last GOP of the event clip file.

In the present method, the group header is tagged at the beginning of the event clip file and, once the navigator gets the file path from the database, the group header of the event clip file is read and, if the requested event is not included in the group header, the rest of the event clip file contents are not read.

In the present method, the video clip covering the events is generated by a clip generator cooperative to an imaging camera;
wherein the event metadata from one or more external image content analyzing systems cooperative to the imaging camera is received in the memory buffer for generating the video clip as and when the events occur in real time and are detected by the content analyzing system, whereby the clip generator stores the event metadata in an event queue, reads the video data stream and constructs the GOPs of the video clip.

In the present method, construction of the GOP includes the event metadata from the event queue, which is embedded in the header of the event clip, and the event clip is stored in the permanent storage for further retrieval.

The present method includes a streaming device cooperative to the navigator enabling the user to download only the required video frames instead of the complete video files for further processing, viz. rendering the video, or transcoding into other formats on demand.

According to another aspect of the present invention, there is provided a system for generating event evidencing video clips against multiple events occurring in a closely spaced timeline involving the above method, comprising:
the clip generator for generating at least a video clip of a certain time duration from a stream of video frames covering multiple events captured in the video stream that are closely spaced in the timeline, including event metadata for all the events covered by the video clip containing event timestamps and event type information;
the memory buffer to store the video under an event clip file with the event metadata for all the events covered by the video clip;
said clip generator monitoring the memory buffer continuously for the arrival of new event metadata in the event clip file, whereby the event clip file is closed if no further event is generated within a pre-fixed time limit or the file size goes beyond a pre-fixed size in bytes, and thereby tagging all the events in the event clip file that occurred within the timestamp of the first frame and that of the last frame in the event clip file as a form of header information in the event clip file against each GOP;
the permanent storage element to store the tagged event clip file with the database record having searching attributes for later searching of the file by the navigator, based on a request for relevant video against any particular event within a given span of time, and to identify, among the multiple such event clip files stored in the permanent storage, the file that includes the time requested.

BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS:
Figure 1 shows how occurrence of events in quick succession generates redundant video data in storage.
Figure 2 shows process flowchart to generate event clips in accordance with the present invention.
Figure 3 shows the flowchart of generating event clips for frequently occurred events without generating the redundant video data in accordance with the present invention.

DESCRIPTION OF THE INVENTION WITH REFERENCE TO THE ACCOMPANYING DRAWINGS:
The present invention discloses an innovative method for generating a single event clip against multiple events, where the event clip size and duration in the timeline are dynamically controlled to generate an optimal number of event clips, and metadata is generated and embedded in the event clip itself to tag the event instants.
An efficient way of streaming the video against any particular event from the proper timestamp is also implemented by the method of the present invention. The present method involves a clip generator cooperative to an imaging camera for generating a video clip, particularly an event clip, from a stream of video frames; a navigator for easy navigation across video frames in a file based on event metadata stored in the same file, or for fast forward and reverse fast forward; and a streaming device cooperative to the navigator enabling the user to download only the required video frames instead of the complete video files for further processing, viz. rendering the video, or transcoding into other formats on demand.
Event clip generation mechanism:

The clip generator generates a video clip, which is basically the event clip, against detection of any activity of interest (event) in a video stream. Traditionally, against detection of each such event, a video clip of a certain duration is generated. However, quick occurrence of events results in redundant data in the storage, as explained with reference to figure 1 above.
A new method of generating a single event clip for multiple events that are closely spaced in the timeline is proposed. A memory buffer receives the event metadata from one or more external image content analyzing systems cooperative to the imaging camera for generating a video clip as and when the events occur in real time and are detected by the content analyzing system. The event metadata includes the event timestamps and the event type. The event clip generator continuously monitors the buffer for the arrival of new event metadata. If no further event is generated within a certain time limit, the current event clip file is closed. The current event clip is also closed if the file size goes beyond a pre-fixed size in bytes. All the events that have occurred during this time (that is, within the timestamp of the first frame and that of the last frame in the file) are tagged as a form of header information in the file (against each Group of Pictures, or GOP) before it is stored in a permanent storage element.
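By way of illustration only, the following Python sketch outlines the clip generation loop described above. The inactivity limit, the size limit and the storage helpers (new_clip_path, append_gop, index_clip) are assumptions made for the sketch and are not prescribed by the present specification.

```python
import queue
import time

# Illustrative limits; the specification only requires that both be pre-fixed.
MAX_IDLE_SECONDS = 10          # close the clip if no new event arrives within this time
MAX_CLIP_BYTES = 50 * 1024**2  # close the clip if the file grows beyond this many bytes


def generate_event_clips(frame_stream, event_queue, storage):
    """Fold closely spaced events into a single event clip file.

    frame_stream yields (timestamp, gop_bytes) pairs, one per Group of Pictures;
    event_queue carries (timestamp, event_type) metadata from the external image
    content analyzing system; storage is a hypothetical helper object.
    """
    clip = None                      # the currently open event clip, if any
    last_event_time = time.monotonic()

    for gop_ts, gop_bytes in frame_stream:
        # Drain event metadata that arrived while this GOP was being captured.
        gop_events = []
        try:
            while True:
                gop_events.append(event_queue.get_nowait())
                last_event_time = time.monotonic()
        except queue.Empty:
            pass

        if clip is None and gop_events:
            # A new event opens a fresh clip file.
            clip = {"path": storage.new_clip_path(), "size": 0,
                    "first_ts": gop_ts, "last_ts": gop_ts}

        if clip is not None:
            # Tag this GOP's events into its header before appending it to the file.
            storage.append_gop(clip["path"], header_events=gop_events, payload=gop_bytes)
            clip["size"] += len(gop_bytes)
            clip["last_ts"] = gop_ts

            idle_for = time.monotonic() - last_event_time
            if idle_for > MAX_IDLE_SECONDS or clip["size"] > MAX_CLIP_BYTES:
                # Close the clip and index it so the navigator can find it later.
                storage.index_clip(clip["path"], clip["first_ts"], clip["last_ts"])
                clip = None
```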
A database record is also created for later searching of the file with at least the following attributes per record: the first frame start time, the last frame start time and the file path of the event clip in the permanent storage.

An exemplary frame data format is shown hereunder in Table 1.

Size in bytes | Parameter | Description
4 | Magic String ('00dc') | Starting of a frame
4 | Media Type | MJPG-0, MPEG-1, H264-2, PCMU-3, PCMA-4, L16-5
4 | Frame Type | 0 = KEY FRAME, 1 = NON-KEY FRAME
8 | Timestamp | Frame time
4 | Frame Length | Raw frame byte array size
| Frame payload with standard headers | Raw frame byte array with media header
8 | Frame Start Offset | Starting pointer of the next frame in the file

Table 1
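A minimal Python sketch of serialising one frame record with the field widths of Table 1 is given below; the little-endian byte order and the function name are assumptions made for the sketch, since the specification fixes only the field sizes.

```python
import struct

MEDIA_TYPES = {"MJPG": 0, "MPEG": 1, "H264": 2, "PCMU": 3, "PCMA": 4, "L16": 5}
KEY_FRAME, NON_KEY_FRAME = 0, 1


def pack_frame(media: str, frame_type: int, timestamp: int,
               payload: bytes, next_frame_offset: int) -> bytes:
    """Serialise one frame record with the field widths listed in Table 1."""
    header = struct.pack(
        "<4s i i q i",       # magic(4), media type(4), frame type(4), timestamp(8), length(4)
        b"00dc",             # magic string marking the start of a frame
        MEDIA_TYPES[media],
        frame_type,
        timestamp,
        len(payload),
    )
    trailer = struct.pack("<q", next_frame_offset)  # starting pointer of the next frame in the file
    return header + payload + trailer
```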
The following Table 2 shows the structure of the event clip. The event clip consists of multiple packets; each packet in turn contains media frames and event metadata.

Size in bytes | Parameter | Description | Packet
| GOP byte Offset | Offset in bytes to the next GOP in this file | 1st Packet (Frame Data and Metadata)
N x 8 | Event Timestamps and Event Types | Timestamps of frames where an event has been detected, in byte-offsets |
| Media Frame data | As described in Table 1 above |
| Media Frame data | As described in Table 1 above |
| ... | |
| Media Frame data | As described in Table 1 above |
| GOP byte Offset | Offset in bytes from this GOP to the next GOP | 2nd Packet (Frame Data and Metadata)
N x 8 | Event Timestamps and Event Types | Timestamps of frames where an event has been detected |
| Media Frame data | As described above |
| Media Frame data | As described above |
| ... | |
| Media Frame data | As described above |
(further packets follow the same layout)

Table 2
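The packet layout of Table 2 can be assembled in the same spirit. In the sketch below each event entry is packed into 8 bytes (a 4-byte timestamp offset plus a 4-byte event type code) to match the N x 8 sizing, but that exact encoding, and the count prefix, are assumptions rather than requirements of the specification.

```python
import struct


def pack_gop_packet(next_gop_offset: int, events, frames) -> bytes:
    """Assemble one packet (GOP plus its metadata) of the event clip per Table 2.

    events: iterable of (timestamp_offset, event_type) pairs, packed 8 bytes each;
    frames: iterable of frame records already packed as in Table 1.
    """
    events = list(events)
    header = struct.pack("<q", next_gop_offset)        # byte offset to the next GOP in this file
    header += struct.pack("<i", len(events))           # assumed count prefix for the event entries
    for ts_offset, event_type in events:
        header += struct.pack("<Ii", ts_offset, event_type)   # N x 8 bytes of event metadata
    return header + b"".join(frames)
```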

Streaming of Video on demand against any given Event:
Whenever a request for relevant video against any particular event within a given span of time arrives, the navigator means consults the database to identify the files that include the time requested. After getting the file path from the database records, the navigator goes on reading the header of each GOP of the file. As each GOP contains the byte offset of the next GOP in the file, very quick navigation across successive GOPs can be done. Each GOP header contains event metadata, and hence the navigator knows whether the GOP contains the requested type of events or not, to decide whether to skip the GOP or stream it.
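An illustrative navigator loop under this scheme might look as follows; read_gop_header() and send() are hypothetical helpers (one parsing the GOP header laid out as in Table 2, the other delivering bytes to the requesting client), not functions defined by the specification.

```python
def stream_event(clip_path: str, wanted_event_type: int, send) -> None:
    """Walk the clip GOP by GOP, streaming only GOPs that contain the wanted event type."""
    with open(clip_path, "rb") as f:
        offset = 0
        while True:
            header = read_gop_header(f, offset)       # hypothetical parser for the GOP header
            if header is None:                        # no further GOPs in the file
                break
            next_offset, event_types, payload_start, payload_length = header
            if wanted_event_type in event_types:      # this GOP holds the requested event
                f.seek(payload_start)
                send(f.read(payload_length))          # stream the GOP to the client
            # otherwise skip directly to the next GOP through its byte offset
            offset = next_offset
```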
This scheme can be further extended and a group header can be constructed. The group header will contain the event types of the events that have occurred during the time interval determined by the first GOP to the last GOP of the file. The group header can be tagged at the beginning of the file. Once the navigator gets the file path from the database, the group header of the file is read and, if the requested event is not included in the group header, the rest of the file contents need not be read.
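With the group header extension, the navigator can reject a whole file with a single read, roughly as sketched below; read_group_header() is again a hypothetical helper returning the set of event types tagged at the beginning of the file.

```python
def clip_may_contain(clip_path: str, wanted_event_type: int) -> bool:
    """Check the group header at the start of the clip before reading any GOP."""
    group_event_types = read_group_header(clip_path)   # hypothetical parser for the group header
    return wanted_event_type in group_event_types      # False means the rest of the file is skipped
```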
The accompanying figure 2 shows the process flowchart to generate event clips.
The process of generating Event clips:
The video coming from video sources (i.e. cameras) is temporarily recorded in storage. The clip generator stores event metadata (timestamp, type of event, position of objects in the image frame, etc.) in an event queue.
The clip generator reads video data from storage and constructs the GOPs of the video. While constructing a GOP of video it gets the event metadata from the event queue, embeds the data in a header of the clip and stores the event clip in a persistent storage for further retrieval. The timestamp of the first frame and also that of the last frame is stored in the database against each such event clip.
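The per-clip database record can be as small as the sketch below; the SQLite backend, the table name and the column names are assumptions, since the specification only fixes which attributes are stored (the file path and the first and last frame timestamps).

```python
import sqlite3

con = sqlite3.connect("event_clips.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS event_clip (
        file_path      TEXT PRIMARY KEY,
        first_frame_ts INTEGER,   -- timestamp of the first frame in the clip
        last_frame_ts  INTEGER    -- timestamp of the last frame in the clip
    )""")


def index_clip(file_path: str, first_frame_ts: int, last_frame_ts: int) -> None:
    """Record one event clip so that it can be located later by time."""
    con.execute("INSERT OR REPLACE INTO event_clip VALUES (?, ?, ?)",
                (file_path, first_frame_ts, last_frame_ts))
    con.commit()
```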
The process of identifying the clips:
The database contains the path of each event clip (a file in the computer) and the timestamps of the first and the last frame within the event clip. When one wants to watch the video against any event, one can consult the database and locate the event clip that contains the event metadata. The event clip thus located can be played on a standard media player.
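Locating the clips that cover a requested instant then reduces to a range query against the same record, for example (same assumed schema as above):

```python
import sqlite3


def find_clips(db_path: str, requested_ts: int) -> list:
    """Return the paths of event clips whose first/last frame timestamps span the requested time."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT file_path FROM event_clip "
        "WHERE ? BETWEEN first_frame_ts AND last_frame_ts",
        (requested_ts,),
    ).fetchall()
    con.close()
    return [path for (path,) in rows]
```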
The accompanying figure 3 shows the entire process in a flowchart.
CLAIMS:
WE CLAIM:
1. A method for generating event evidencing video clips against multiple events occurring in a closely spaced timeline comprising
generating at least a video clip of a certain time duration from a stream of video frames covering multiple events captured in the video stream that are closely spaced in the timeline;
storing the video clip in a memory buffer under an event clip file with event metadata for all the events covered by the video clip containing event timestamps and event type information;
a clip generator monitoring the memory buffer continuously for the arrival of new event metadata in the event clip file whereby the event clip file is closed if no further event is generated within a pre-fixed time limit or the file size goes beyond a pre-fixed size in bytes;
tagging all the events in the event clip file that occurred within the timestamp of the first frame and that of the last frame in the event clip file as a form of header information in the event clip file against each Group of Pictures (GOP);
storing the tagged event clip file in a permanent storage element with a database record with searching attributes for later searching of the file by a navigator based on a request for relevant video against any particular event within a given span of time and to identify, among the multiple such event clip files stored in the permanent storage, the file that includes the time requested.

2. The method as claimed in claim 1, wherein the searching attributes for later searching of the event clip file include the first frame start time and the last frame start time along with the file path in the permanent storage.

3. The method as claimed in claim 1 or 2, wherein the navigator, after getting the event clip file path from the database, reads the header of each GOP of the event clip file;
wherein each GOP contains the byte offset of the next GOP in the event clip file to enable quick navigation across successive GOPs; and
wherein each GOP header contains the event metadata to enable the navigator to identify whether the GOP contains the requested type of events or not and to decide whether to skip the GOP or stream it.

4. The method as claimed in any one of claims 1 to 3, wherein a group header is constructed involving the event types of the events that have occurred during the time interval determined by the first GOP to the last GOP of the event clip file.

5. The method as claimed in any one of claims 1 to 4, wherein the group header is tagged at the beginning of the event clip file and, once the navigator gets the file path from the database, the group header of the event clip file is read and, if the requested event is not included in the group header, the rest of the event clip file contents are discarded from reading.

6. The method as claimed in any one of claims 1 to 5, wherein the video clip covering the events is generated by a clip generator cooperative to an imaging camera;
wherein the event metadata from one or more external image content analyzing systems cooperative to the imaging camera is received in the memory buffer for generating the video clip as and when the events occur in real time and are detected by the content analyzing system, whereby the clip generator stores the event metadata in an event queue and the event clip generator reads the video data stream and constructs the GOP of the video clip.

7. The method as claimed in any one of claims 1 to 6, wherein construction of the GOP includes the event metadata from the event queue, which is embedded in the header of the event clip, and the event clip is stored in the permanent storage for further retrieval.

8. The method as claimed in any one of claims 1 to 7, which includes a streaming device cooperative to the navigator enabling the user to download only the required video frames instead of the complete video files for further processing, viz. rendering the video, or transcoding into other formats on demand.

9. A system for generating event evidencing video clips against multiple events occurring in a closely spaced timeline involving the method as claimed in any one of claims 1 to 8, comprising
the clip generator for generating at least a video clip of a certain time duration from a stream of video frames covering multiple events captured in the video stream that are closely spaced in the timeline, including event metadata for all the events covered by the video clip containing event timestamps and event type information;
the memory buffer to store the video under an event clip file with the event metadata for all the events covered by the video clip;
said clip generator monitoring the memory buffer continuously for the arrival of new event metadata in the event clip file whereby the event clip file is closed if no further event is generated within a pre-fixed time limit or the file size goes beyond a pre-fixed size in bytes and thereby tagging all the events in the event clip file that occurred within the timestamp of the first frame and that of the last frame in the event clip file as a form of header information in the event clip file against each GOP;
the permanent storage element to store the tagged event clip file with the database record having searching attributes for later searching of the file by the navigator based on a request for relevant video against any particular event within a given span of time and to identify, among the multiple such event clip files stored in the permanent storage, the file that includes the time requested.

Dated this 25th day of March, 2019
Anjan Sen
Of Anjan Sen & Associates
(Applicant's Agent)
IN/PA-199

Documents

Application Documents

# Name Date
1 201831007199-RELEVANT DOCUMENTS [13-06-2025(online)].pdf 2025-06-13
2 201831007199-IntimationOfGrant07-04-2025.pdf 2025-04-07
3 201831007199-PatentCertificate07-04-2025.pdf 2025-04-07
4 201831007199-Written submissions and relevant documents [17-07-2024(online)].pdf 2024-07-17
5 201831007199-Correspondence to notify the Controller [28-06-2024(online)].pdf 2024-06-28
6 201831007199-US(14)-HearingNotice-(HearingDate-03-07-2024).pdf 2024-06-10
7 201831007199-ABSTRACT [10-01-2023(online)].pdf 2023-01-10
8 201831007199-CLAIMS [10-01-2023(online)].pdf 2023-01-10
9 201831007199-COMPLETE SPECIFICATION [10-01-2023(online)].pdf 2023-01-10
10 201831007199-FER_SER_REPLY [10-01-2023(online)].pdf 2023-01-10
11 201831007199-OTHERS [10-01-2023(online)].pdf 2023-01-10
12 201831007199-FER.pdf 2022-07-12
13 201831007199-FORM 18 [31-12-2021(online)].pdf 2021-12-31
14 201831007199-ENDORSEMENT BY INVENTORS [25-03-2019(online)].pdf 2019-03-25
15 201831007199-DRAWING [25-03-2019(online)].pdf 2019-03-25
16 201831007199-COMPLETE SPECIFICATION [25-03-2019(online)].pdf 2019-03-25
17 201831007199-APPLICATIONFORPOSTDATING [25-02-2019(online)].pdf 2019-02-25
18 201831007199-PostDating-(25-02-2019)-(E-6-11-2019-KOL).pdf 2019-02-25
19 201831007199-Proof of Right (MANDATORY) [18-08-2018(online)].pdf 2018-08-18
20 201831007199-FORM-26 [23-05-2018(online)].pdf 2018-05-23
21 201831007199-STATEMENT OF UNDERTAKING (FORM 3) [26-02-2018(online)].pdf 2018-02-26
22 201831007199-PROVISIONAL SPECIFICATION [26-02-2018(online)].pdf 2018-02-26
23 201831007199-FORM 1 [26-02-2018(online)].pdf 2018-02-26
24 201831007199-DRAWINGS [26-02-2018(online)].pdf 2018-02-26

Search Strategy

1 SearchStrategyE_12-07-2022.pdf

ERegister / Renewals

3rd: 13 Jun 2025 (covering 26/03/2020 to 26/03/2021)
4th: 13 Jun 2025 (covering 26/03/2021 to 26/03/2022)
5th: 13 Jun 2025 (covering 26/03/2022 to 26/03/2023)
6th: 13 Jun 2025 (covering 26/03/2023 to 26/03/2024)
7th: 13 Jun 2025 (covering 26/03/2024 to 26/03/2025)
8th: 13 Jun 2025 (covering 26/03/2025 to 26/03/2026)