
Viewer Exposure Measurement System

Abstract: Disclosed is viewer exposure measurement system (100, 200) comprising: motion sensor(s) (102), each motion sensor positioned in a distinct region of a designated viewing area, and configured to detect a movement of potential viewer (214) within the designated viewing area; and processing unit (104, 212), communicably coupled to the motion sensor(s), configured to: acquire motion sensor data from each of the motion sensor(s), wherein the motion sensor data comprises timestamps of movement events of potential viewers, identify a set of consecutive movement events of a given potential viewer from each of the motion sensor(s) occurring within a specific timeframe, to determine a viewing time and a dwell time of the given potential viewer in between the set of consecutive movement events, and determine a number of the potential viewers during a pre-defined timeline based on discrete sets of consecutive movement events of the potential viewers within the designated viewing area. FIG. 1


Patent Information

Application #
Filing Date
13 February 2024
Publication Number
33/2025
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application

Applicants

ADONMO PRIVATE LIMITED
Aparna Cyberzon, AB 104, Near Citizen Hospital, Serilingampally, Hyderabad, Rangareddi, India

Inventors

1. Anmol Srivastava
Flat no. 70/2, Sector-1, Aditya World City, NH-24, Ghaziabad-201002
2. Aditya Patwardhan
House E3, Shalimar Garden, Kolar Road, Bhopal

Specification

Description:
TECHNICAL FIELD
The present disclosure relates to a viewer exposure measurement system.
BACKGROUND
Digital Out-of-Home (DOOH) advertising has become a prominent marketing channel, reaching audiences (namely, users, viewers) in key locations such as transportation hubs, retail stores, public buildings, and so forth. In an example, the DOOH is used to run or present campaigns. The campaigns encompass various marketing messages, promotions, or brand awareness initiatives delivered to a target audience through the DOOH. In this regard, determining the effectiveness of these campaigns relies on accurate measurement of viewer exposure. Moreover, in the DOOH sector, advertisement selling is based on the number of advertisement impressions delivered and the potential viewing time. Currently, there is no simple yet effective method to measure viewership. This is a significant pain point for advertisers (namely, clients), who ultimately end up paying for more than what was promised to be delivered.
Currently, camera-based systems are being used that utilize computer vision algorithms to detect and track the viewers in the vicinity of the advertising screen and thereby measure the viewer's exposure time. While offering detailed insights into viewer demographics and attention patterns, the camera-based systems raise privacy concerns and require substantial computational resources. Moreover, the computer vision algorithms are expensive and sometimes infeasible solutions. Additionally, a large number of sensors are being used for collecting data, followed by sophisticated machine learning algorithms to estimate viewership metrics such as opportunity to see, number of potential views, viewing frequency, and so forth. However, developing sophisticated machine learning algorithms requires significant technical effort and is expensive.
Therefore, in light of the foregoing discussion, there exists a need to overcome the aforementioned drawbacks associated with the conventional systems for monitoring the viewership metrics.
SUMMARY
The present disclosure provides a viewer exposure measurement system. The present disclosure provides a solution to the technical problem of accurately quantifying advertisement viewership simply and affordably, without breaching the viewers' privacy. An aim of the present disclosure is to provide a solution that at least partially overcomes the problems encountered in the prior art and provides an improved viewer exposure measurement system that facilitates a cost-effective and easy-to-implement solution for real-time viewer engagement monitoring.
One or more objectives of the present disclosure are achieved by the solutions provided in the enclosed independent claims. Advantageous implementations of the present disclosure are further defined in the dependent claims.
In one aspect, the present disclosure provides a viewer exposure measurement system comprising:
at least two motion sensors, each motion sensor positioned in a distinct region of a designated viewing area, and configured to detect a movement of a potential viewer within the designated viewing area; and
a processing unit, communicably coupled to the at least two motion sensors, configured to:
acquire motion sensor data from each of the at least two motion sensors, wherein the motion sensor data comprises timestamps of a plurality of movement events of a plurality of potential viewers,
identify a set of consecutive movement events of a given potential viewer from each of the at least two motion sensors occurring within a specific timeframe, to determine a viewing time and a dwell time of the given potential viewer in between the set of consecutive movement events, and
determine a number of the plurality of potential viewers during a pre-defined timeline based on a plurality of discrete sets of consecutive movement events of the plurality of potential viewers within the designated viewing area.
The aforementioned viewer exposure measurement system employs the at least two motion sensors that cover distinct regions and identify the sets of consecutive events within a specific timeframe. Advantageously, the at least two motion sensors can accurately differentiate between the individual viewers and the movements thereof, even when overlapping across the designated viewing area. This, combined with the pre-defined timeline analysis, allows for a nuanced understanding of the viewer engagement. Moreover, the aforementioned viewer exposure measurement system employs the processing unit that calculates both the viewing time for short movements across the area and the dwell time for longer movements through it, providing a richer picture of engagement compared to solely focusing on presence. Additionally, by analyzing discrete sets of consecutive events instead of continuous movement, the viewer exposure measurement system reduces the impact of false positives and improves overall accuracy. The aforementioned combination of features leads to a more precise, cost-effective, and privacy-friendly solution for measuring viewer exposure, thus fostering better advertising campaign optimization and resource allocation.
It is to be appreciated that all the aforementioned implementation forms can be combined. All steps which are performed by the various entities described in the present application as well as the functionalities described to be performed by the various entities are intended to mean that the respective entity is adapted to or configured to perform the respective steps and functionalities. It will be appreciated that features of the present disclosure are susceptible to being combined in various combinations without departing from the scope of the present disclosure as defined by the appended claims.
Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative implementations construed in conjunction with the appended claims that follow.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is an illustration of a block diagram of a viewer exposure measurement system, in accordance with an embodiment of the present disclosure; and
FIG. 2 is an illustration of a viewer exposure measurement system being installed in an environment, in accordance with an embodiment of the present disclosure.
The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the present disclosure is not limited to specific methods and instrumentalities disclosed herein. Moreover, those skilled in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
Embodiments of the present disclosure will now be described, by way of example only, with reference to the following diagrams wherein:
In the accompanying drawings, an underlined number is employed to represent an item over which the underlined number is positioned or an item to which the underlined number is adjacent. A non-underlined number relates to an item identified by a line linking the non-underlined number to the item. When a number is non-underlined and accompanied by an associated arrow, the non-underlined number is used to identify a general item at which the arrow is pointing.
DETAILED DESCRIPTION OF EMBODIMENTS
The following detailed description illustrates embodiments of the present disclosure and ways in which they can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
FIG. 1 is a block diagram of a viewer exposure measurement system 100, in accordance with an embodiment of the present disclosure. With reference to FIG. 1, there is shown a block diagram of the viewer exposure measurement system 100. The viewer exposure measurement system 100 comprises at least two motion sensors (depicted as 102) and a processing unit 104. The term "viewer exposure measurement system" 100 as used herein refers to a technologically advanced apparatus designed for quantifying and analyzing the engagement of potential viewers within a designated viewing area. The term "designated viewing area" as used herein refers to a predefined space or region where an advertising screen or display is positioned. The advertising screen or display is monitored by the viewer exposure measurement system 100. Optionally, the designated viewing area represents the region where potential viewers are expected to engage with the displayed content, and within which the at least two motion sensors 102 are strategically placed to detect and analyze movements, providing data related to the viewer exposure. The term "viewer" as used herein refers to an individual within the designated viewing area whose movements are detected and analyzed by the at least two motion sensors 102 of the viewer exposure measurement system 100. The viewer encompasses individuals who may be present in proximity to the advertising screen or the display, and their interactions and movements are assessed to quantify engagement and measure exposure to the displayed content.
The term "motion sensor" as used herein refers to a device designed to detect physical movement within its surrounding environment. Typically, the at least two motion sensors 102 operate by sensing changes in infrared radiation, sound waves, or other relevant stimuli. In this regard, each of the at least two motion sensors 102 is positioned in a distinct region of the designated viewing area, and configured to detect the movement of the potential viewer within the designated viewing area. Herein, the term "distinct region" refers to a specific and delineated area within the designated viewing space. The distinct regions ensure that the coverage areas of the at least two motion sensors 102 do not overlap with each other. The at least two motion sensors are positioned to cover a detection area within the distinct region of the designated viewing area, to detect the movement of the potential viewer within the detection area.
Optionally, the movement of a given potential viewer within the designated viewing area comprises a movement across the designated viewing area and/or a movement through the designated viewing area. In this regard, the movement across the designated viewing area implies any motion that occurs while the viewer is within the boundaries of the designated area. Moreover, the movement through the designated viewing area implies actions involving entry into or exit from the specified viewing space. It will be appreciated that by considering both the aforementioned movements that are across and through the designated viewing area, the viewer exposure measurement system 100 captures a more nuanced view of the given potential viewer interactions, thus accommodating diverse scenarios. In an implementation, the viewer exposure measurement system 100 utilizes the at least two motion sensors 102 that are strategically placed around the designated viewing area to detect and record the movements. In an example, a PIR sensor could detect the movements across the designated viewing area, while an ultrasonic sensor could detect the movements through the entrance, thus providing a comprehensive view of the given potential viewer activities and the viewer engagement patterns.
Optionally, the at least two motion sensors 102 are selected from: an infrared (IR) motion sensor, a passive infrared (PIR) sensor, a microwave sensor, an acoustic sensor, a tomographic sensor, a proximity sensor, an ultrasonic sensor, a capacitive sensor, an optical sensor, or a combination thereof. Herein, the term infrared (IR) motion sensor refers to a sensor that utilizes infrared radiation to detect changes in temperature caused by the movement of a subject (namely, a person, a viewer, a user). The technical effect of using the infrared (IR) motion sensor is that such sensors are efficient for detecting heat-emitting subjects in the designated viewing area, thus providing reliable movement data for the viewer. Herein, the term "passive infrared (PIR) sensor" refers to a sensor that senses the infrared radiation emitted by the subject within its field of view and triggers a response when changes are detected. Advantageously, the passive infrared sensor is effective in detecting the movement of the viewer due to the body's heat emission thereof, enhancing accuracy in the viewer tracking. Herein, the term "microwave sensor" refers to a sensor that emits microwaves and analyses the reflected signals to identify moving viewers. Beneficially, the microwave sensor is used for detecting motion regardless of temperature variations, ensuring consistent performance in different environmental conditions.
Herein the term "acoustic sensor" refers to a sensor that detects motion through sound waves or vibrations. The acoustic sensor is useful for scenarios where visual detection might be challenging, thus providing an alternative method to identify the movement of the subject. Herein the term "tomographic sensor" refers to a sensor that utilizes radio waves to create a three-dimensional image of the environment, detecting changes in the pattern. The tomographic sensor provides a comprehensive view of motion, beneficial for precise tracking within the designated area. Herein, the term "proximity sensor" refers to a sensor that detects the presence or absence of the viewer within a certain range without direct contact. The proximity sensor aids in precise data collection. Herein, the term "ultrasonic sensor" refers to a sensor that emits ultrasonic waves and measures the reflection of the ultrasonic waves to identify the position of the subject. The ultrasonic sensor is effective in capturing movement, especially in environments with obstacles or varying light conditions. Herein, the term "capacitive sensor" refers to a sensor that detects changes in capacitance caused by nearby viewers. The capacitive sensor is sensitive to subtle movements, enhancing the viewer exposure measurement system's 100 ability to capture fine-grained viewer interactions. Herein, the term "optical sensor" refers to a sensor that relies on light detection to identify changes in the optical environment. The optical sensor is useful for scenarios where visual confirmation of motion is critical, contributing to a comprehensive data acquisition. Optionally, the viewer exposure measurement system 100 comprises two sensors such as the PIR sensor and the ultrasonic sensor. Optionally, the aforementioned sensors are used in any combination to ensure optimal performance and data quality in diverse real-world applications.
The term "processing unit" 104 as used herein refers to an application, program, or device that responds to requests for information or services by another application, program, process or device (such as the external device) via a network interface. Optionally, the processing unit 104 also encompasses software that makes the act of serving information or providing services possible. It will be appreciated that optionally the processing unit 104 includes, but is not limited to, a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computer (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, or any other type of processing circuit. The processing unit 104 is communicably coupled to the at least two motion sensors 102. Optionally, the processing unit 104 can communicate with the at least two motion sensors 102 using a wired communication or a wireless communication.
Herein, the term "acquire" refers to a process of receiving and reading an output such as the motion sensor data generated by each of the at least two motion sensors 102 deployed in the designated viewing area. In an example, the processing unit 104 could acquire the motion sensor data using direct wire connections or cables such as a universal serial bus (USB) or a serial peripheral interface (SPI). In another example, the processing unit 104 could acquire the motion sensor data using wireless communication such as Bluetooth®, Wi-Fi, and so forth.
Herein, the term "motion sensor data" refers to data received from each of the at least two motion sensors 102. The motion sensor data comprises the timestamps of the plurality of movement events of the plurality of potential viewers. Herein, the timestamps refer to precise time markers indicating when a movement event was detected. Optionally, the motion sensor data includes sensor-specific information such as signal strength, direction of movement, or additional environmental data. In an example, when the potential viewer enters the designated viewing area at 10:01:03 AM, the at least two motion sensors 102 detect the movement and record the timestamp. Subsequently, when the potential viewer moves across the designated area and triggers the at least two motion sensors 102 at 10:01:08 AM, the at least two motion sensors 102 again record the timestamp. Beneficially, the timestamps allow the processing unit 104 to analyze the temporal sequence of the plurality of movement events across the at least two motion sensors 102 and accurately identify individual viewers and their movements.
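The motion sensor data described above can be modelled as a minimal record. The following Python sketch is purely illustrative: the field names and sensor identifiers are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MovementEvent:
    """One movement event as reported by a motion sensor."""
    sensor_id: str    # identifies which of the motion sensors fired
    timestamp: float  # seconds (relative clock) when the event was detected

# Mirrors the example in the text: a viewer enters at 10:01:03 and
# crosses the area at 10:01:08, i.e. 5 seconds later.
entry = MovementEvent(sensor_id="sensor_A", timestamp=0.0)
crossing = MovementEvent(sensor_id="sensor_B", timestamp=5.0)
print(crossing.timestamp - entry.timestamp)  # 5.0 seconds between the events
```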
Herein, the term "movement events" refers to any detection of activity within a given motion sensor's range that indicates the potential presence or movement of the potential viewer. In an example, when a given motion sensor is a PIR sensor, the movement event could be triggered by a change in infrared radiation, thus indicating the potential viewer entering the given motion sensor's field of view. In an example, the potential viewer could actively engage or not engage with the DOOH content.
The processing unit 104 is configured to identify a set of consecutive movement events of a given potential viewer from each of the at least two motion sensors 102 occurring within a specific timeframe, to determine a viewing time and a dwell time of the given potential viewer in between the set of consecutive movement events. Herein, the set of consecutive movement events refers to a series of the movement events detected by the at least two motion sensors 102, indicating a continuous presence or activity of the given potential viewer within the designated viewing area. For example, the set of consecutive movement events is generated when the potential viewer walks through the designated viewing area, triggering two motion sensors from the at least two motion sensors 102 in sequence within a few seconds. Herein, the term "specific timeframe" refers to a predetermined time period within which the viewer exposure measurement system 100 analyzes the set of consecutive movement events to determine viewer exposure metrics. For example, the viewer exposure measurement system 100 might analyze the motion sensor data in 5 seconds intervals to calculate the viewing time and the dwell times. The term "viewing time" as used herein refers to a total duration for which a potential viewer is detected within the designated viewing area, regardless of the movement patterns thereof. For example, the potential viewer standing still in front of the DOOH display for 30 seconds would have a viewing time of 30 seconds. The term "dwell time" as used herein refers to a time spent by a potential viewer actively moving through the designated viewing area, indicating a higher level of engagement with the content. For example, a potential viewer walking past the DOOH display and glancing at it for 5 seconds would have the dwell time of 5 seconds.
In this regard, the processing unit 104 analyzes the motion sensor data (such as the timestamps and IDs of the at least two motion sensors 102) to identify sets of consecutive movement events that fall within the specific timeframe. In an example, the processing unit 104 filters the set of consecutive movement events based on the defined specific timeframe. The processing unit 104 sums up the total duration between the first event and the last event in a set to determine the viewing time. Moreover, the processing unit 104 measures the time between the consecutive movement events that were detected by different sensors from the at least two motion sensors 102, indicating movement and active engagement, to calculate the dwell time.
It will be appreciated that by identifying the sets of consecutive movement events and applying the specific timeframe, the processing unit 104 effectively distinguishes the given potential viewer and the engagement patterns thereof, providing more nuanced insights into the behavior of the given potential viewer than simple presence detection.
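The viewing-time and dwell-time calculations described above might be sketched as follows in Python. The rule that the viewing time spans the first to the last event in a set, and that only gaps between events from different sensors count toward the dwell time, is taken from the description; the function names and event representation are assumptions.

```python
def viewing_time(events):
    """Total duration between the first and last event in a set.
    `events` is a list of (sensor_id, timestamp) tuples."""
    timestamps = [t for _, t in events]
    return max(timestamps) - min(timestamps)

def dwell_time(events):
    """Sum of the gaps between consecutive events detected by *different*
    sensors, indicating movement through the designated viewing area."""
    ordered = sorted(events, key=lambda e: e[1])
    total = 0.0
    for (sensor_a, t_a), (sensor_b, t_b) in zip(ordered, ordered[1:]):
        if sensor_a != sensor_b:  # only cross-sensor gaps count
            total += t_b - t_a
    return total

# A viewer triggers sensor A, then sensor B 5 s later, then B again 2 s later.
events = [("A", 0.0), ("B", 5.0), ("B", 7.0)]
print(viewing_time(events))  # 7.0 — first event to last event
print(dwell_time(events))    # 5.0 — only the A-to-B gap counts
```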
Optionally, the viewing time of the potential viewer in between the set of consecutive movement events across the designated viewing area occurs within the specific timeframe ranging from 10 milliseconds to 5 seconds, and the dwell time of the potential viewer in between the set of consecutive movement events through the designated viewing area occurs within the specific timeframe of at least 5 seconds. Optionally, the viewing time is in the range of 10 milliseconds, 20 milliseconds, 50 milliseconds, 100 milliseconds, 200 milliseconds, 500 milliseconds (or 0.5 seconds), 1 second, 2 seconds, 3 seconds, or 4 seconds up to 20 milliseconds, 50 milliseconds, 100 milliseconds, 200 milliseconds, 500 milliseconds (or 0.5 seconds), 1 second, 2 seconds, 3 seconds, 4 seconds, or 5 seconds. The aforementioned range of the viewing time accommodates a wide spectrum of potential viewing scenarios, from very brief interactions (such as 10 milliseconds) to more extended periods (up to 5 seconds). The technical effect of the aforementioned range is the adaptability of the viewer exposure measurement system 100 to different viewing patterns, ensuring accurate and comprehensive measurement of viewer exposure across various timeframes. Optionally, the dwell time of the given potential viewer in between the set of consecutive movement events through the designated viewing area occurs within the specific timeframe of 5 seconds, 6 seconds, 7 seconds, and so forth.
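The timeframe thresholds above could be used to classify the gap between two consecutive movement events. This Python sketch uses the 10 millisecond and 5 second boundaries from the description; the labels and the treatment of sub-10-millisecond gaps as sensor noise are illustrative assumptions.

```python
def classify_gap(gap_seconds):
    """Classify the time between two consecutive movement events using the
    optional timeframes given in the disclosure (illustrative thresholds)."""
    if gap_seconds < 0.010:
        return "noise"    # below 10 ms: likely sensor jitter, not a viewer
    if gap_seconds < 5.0:
        return "across"   # 10 ms to 5 s: movement across the area (viewing time)
    return "through"      # at least 5 s: movement through the area (dwell time)

print(classify_gap(0.5))  # across
print(classify_gap(6.0))  # through
```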
Optionally, each of the plurality of discrete sets of consecutive movement events of the plurality of potential viewers is separated by at least 7 seconds. In this regard, the processing unit 104 identifies and categorizes consecutive movement events into discrete sets, ensuring that the time gap between each of the plurality of discrete sets of consecutive movement events is at least 7 seconds. The technical effect of the aforementioned time gap is in preventing the merging or overlap of the plurality of discrete sets of consecutive movements, providing a clear delineation between individual instances of viewer activity. It will be appreciated that the aforementioned time gap contributes to accurate data interpretation and avoids potential errors that might arise from closely spaced or continuous movements being considered as a single event.
The processing unit 104 is configured to determine the number of the plurality of potential viewers during the pre-defined timeline based on the plurality of discrete sets of consecutive movement events of the plurality of potential viewers within the designated viewing area. In this regard, the processing unit 104 is employed to quantify and track the number of potential viewers over the pre-defined timeline, providing insights into the plurality of potential viewers and overall exposure to the designated viewing area. To do so, the processing unit 104 analyzes the discrete sets of consecutive movement events, associates them with individual potential viewers, and calculates the total number of the plurality of potential viewers within the pre-defined timeline.
Beneficially, determining the number of the plurality of potential viewers using the processing unit 104 enables generation of valuable metrics, such as the number of the plurality of potential viewers, contributing to a comprehensive understanding of viewer engagement patterns. Advantageously, the processing unit 104 facilitates data-driven decision-making in areas such as advertising effectiveness and audience reach within the designated viewing area.
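The grouping of events into discrete sets separated by at least 7 seconds, and the resulting viewer count, might be sketched as follows. This is a minimal illustration under stated assumptions; the disclosure does not prescribe a particular implementation.

```python
def split_into_discrete_sets(timestamps, min_gap=7.0):
    """Group sorted event timestamps into discrete sets: a new set starts
    whenever the gap to the previous event is at least `min_gap` seconds,
    matching the optional 7-second separation in the disclosure."""
    sets, current = [], []
    for t in sorted(timestamps):
        if current and t - current[-1] >= min_gap:
            sets.append(current)
            current = []
        current.append(t)
    if current:
        sets.append(current)
    return sets

# Events at 0-3 s, 15-16 s, and 30 s within the pre-defined timeline:
# the 12 s and 14 s gaps exceed 7 s, so three discrete sets result.
timestamps = [0.0, 1.5, 3.0, 15.0, 16.0, 30.0]
sets = split_into_discrete_sets(timestamps)
print(len(sets))  # 3 — one potential viewer per discrete set
```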
Optionally, the viewer exposure measurement system 100 further comprises a display unit 106 configured to display the viewing time, the dwell time, and the number of the plurality of potential viewers in real-time or over the pre-defined timeline. The term "display unit" 106 as used herein refers to a visual interface that presents information in a human-readable format. For example, the display unit 106 could be a digital screen or any output mechanism capable of showcasing data. In an example, the display unit 106 is positioned near the designated viewing area to display real-time metrics such as the viewing time, the dwell time, and the number of the plurality of potential viewers.
The display unit 106 can exhibit the viewing time, the dwell time, and the number of the plurality of potential viewers in various formats, including numerical values, graphical representations, or a combination thereof. Herein, the real-time metrics refer to the continuous display of metrics as they occur, providing immediate feedback to observers. Additionally, the display unit 106 can present historical data over a pre-defined timeline, offering insights into viewer patterns. Moreover, the inclusion of the display unit 106 enhances the viewer exposure measurement system's 100 usability by providing stakeholders, such as the advertisers or system operators, with instantaneous and historical data. The visual representation facilitates quick decision-making, allowing the users to adapt strategies based on real-time viewer engagement and assess the overall performance of the advertising content over specific time periods.
Optionally, the viewer exposure measurement system 100 further comprises a communication interface 108 configured to transmit the viewing time, the dwell time, and the number of the plurality of potential viewers to a remote server, wherein the communication interface 108 is communicably coupled with the processing unit 104 and the remote server 110. The term "communication interface" 108 as used herein refers to a component that facilitates the exchange of data between different parts of the viewer exposure measurement system 100. The communication interface 108 enables seamless communication between the processing unit 104 and an external entity such as the remote server 110. Examples of the communication interface 108 include, but are not limited to, wired connections such as Ethernet or USB, or wireless technologies such as Wi-Fi, Bluetooth®, or cellular communication. The term "remote server" 110 as used herein refers to a centralized computing resource located at a distance from the viewer exposure measurement system 100. The remote server 110 typically hosts applications or services that can process and store data received from multiple systems 100.
Optionally, the remote server 110 is responsible for aggregating data from various viewer exposure measurement systems 100, enabling centralized analytics, and supporting broader insights into the behavior of the plurality of potential viewers. The communication interface 108 is configured to transmit the viewing time, the dwell time, and the number of the plurality of potential viewers from the processing unit 104 to the remote server 110. Optionally, the transmission can occur in real-time or periodically, depending on the design of the viewer exposure measurement system 100. The inclusion of the communication interface 108 enhances the scalability and data management capabilities of the viewer exposure measurement system 100. The transmission allows for more robust and comprehensive insights into viewer engagement across multiple designated viewing areas, contributing to informed decision-making and strategic planning for the advertisers or the system operators.
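As a rough illustration of the transmission step, the three metrics could be serialized before being sent to the remote server 110. The field names below are hypothetical; the disclosure does not specify a payload format or transport protocol.

```python
import json

def build_metrics_payload(viewing_time_s, dwell_time_s, viewer_count):
    """Serialize the viewing time, dwell time, and viewer count for
    transmission. Field names are illustrative, not prescribed."""
    return json.dumps({
        "viewing_time_s": viewing_time_s,
        "dwell_time_s": dwell_time_s,
        "potential_viewers": viewer_count,
    })

payload = build_metrics_payload(7.0, 5.0, 3)
print(payload)
```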
Optionally, the viewer exposure measurement system 100 further comprises at least one additional sensor positioned in proximity to the designated viewing area, said additional sensor configured to provide complementary data for refining an identification of the given potential viewer and tracking of the movement events. The term "additional sensor" as used herein refers to a supplementary sensing device placed near the designated viewing area. The at least one additional sensor is designed to capture data that complements the information obtained from the at least two motion sensors 102, aiding in a more accurate identification of the potential viewers and better tracking of the movement events. The at least one additional sensor enhances the accuracy and depth of the collected motion sensor data. Optionally, the complementary data includes data on environmental conditions, such as ambient light levels, temperature, and so forth.
Beneficially, the refinement is essential for distinguishing between the potential viewers, especially in scenarios where multiple individuals may be present in the designated viewing area simultaneously. An example of the at least one additional sensor could be a light sensor, providing data on the ambient lighting conditions. This information may be valuable for understanding how external factors, such as changes in light, impact viewer interactions with the advertising display.
Optionally, the viewer exposure measurement system 100 further comprises a memory unit configured to store historical potential viewer data, facilitating long-term analysis and trend identification in viewer engagement. The term "memory unit" as used herein refers to a storage component within the viewer exposure measurement system 100. The memory unit is used to store and retain the historical potential viewer data related to the potential viewer interactions and engagements with the advertising display. Examples of the memory unit include non-volatile storage devices such as hard drives, solid-state drives (SSDs), or electronic memory modules such as random access memory (RAM). These components retain data even when the viewer exposure measurement system 100 is powered off.
The historical potential viewer data involves the systematic recording and retention of information gathered over time, including details about the potential viewer movements, the dwell times, and other relevant metrics. Moreover, by storing the historical potential viewer data, the memory unit enables long-term analysis of the viewer engagement patterns. This extended timeframe allows for the identification of trends, fluctuations, and recurring behaviors in how viewers interact with the advertising display. The stored historical data could include peak engagement periods, seasonal variations, or shifts in viewer preferences over extended periods.
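The long-term analysis described above can be illustrated with a minimal sketch. The record format (timestamp plus dwell time in seconds) and the hourly aggregation are assumptions for illustration, not a format specified in the disclosure:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical historical records: (ISO timestamp, dwell time in seconds).
history = [
    ("2024-02-13T09:15:00", 6.0),
    ("2024-02-13T09:40:00", 12.5),
    ("2024-02-13T18:05:00", 8.0),
    ("2024-02-14T09:20:00", 7.5),
]

def engagement_by_hour(records):
    """Aggregate viewer count and mean dwell time per hour of day."""
    counts = defaultdict(int)
    dwell_totals = defaultdict(float)
    for ts, dwell in records:
        hour = datetime.fromisoformat(ts).hour
        counts[hour] += 1
        dwell_totals[hour] += dwell
    return {h: (counts[h], dwell_totals[h] / counts[h]) for h in counts}

stats = engagement_by_hour(history)
# The hour with the most recorded viewers is a candidate peak period.
peak_hour = max(stats, key=lambda h: stats[h][0])
```

Aggregations of this kind over weeks or months would surface the peak engagement periods and seasonal variations mentioned above.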
Optionally, the processing unit 104 employs an algorithm that utilizes spatial information from each of the at least two motion sensors 102 to differentiate between the potential viewers and overlapping movement events. Herein, the "algorithm" refers to a set of instructions or a computational procedure implemented by the processing unit 104, which processes and interprets the spatial data gathered from each of the at least two motion sensors 102. Examples of such algorithms include pattern recognition algorithms, machine learning algorithms, mathematical algorithms, and the like. The algorithms aim to differentiate between the potential viewers and to resolve overlapping movement events. Optionally, the spatial information includes, but is not limited to, a position, a movement pattern, and a trajectory of the potential viewer within the designated viewing area.
Optionally, the algorithm analyzes the spatial data received from each of the at least two motion sensors 102, considering factors such as a location, a direction, and a speed of movement of the viewer. Optionally, the algorithm then applies predefined rules or patterns to distinguish between different viewers and identify individual movement events. Beneficially, by effectively utilizing the spatial information, the algorithm contributes to the precision and reliability of viewer exposure measurements, providing more accurate data for analysis and reporting.
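One possible differentiation rule of the kind described above can be sketched as follows. This is an illustrative example only, not the claimed algorithm: the event format (timestamp and 2-D position in metres) and the maximum-walking-speed threshold are assumptions:

```python
MAX_SPEED_M_S = 2.5  # assumed plausible maximum walking speed

def group_events(events):
    """Group (timestamp_s, x_m, y_m) events into per-viewer tracks.

    Events are assumed sorted by timestamp. An event joins an existing
    track only if the speed implied by moving from the track's last
    position to the new one does not exceed MAX_SPEED_M_S; otherwise it
    is attributed to a new potential viewer.
    """
    tracks = []
    for t, x, y in events:
        for track in tracks:
            lt, lx, ly = track[-1]
            dt = t - lt
            dist = ((x - lx) ** 2 + (y - ly) ** 2) ** 0.5
            if dt > 0 and dist / dt <= MAX_SPEED_M_S:
                track.append((t, x, y))
                break
        else:
            tracks.append([(t, x, y)])
    return tracks
```

For example, two events one metre and one second apart are consistent with a single walker, while a third event ten metres away a tenth of a second later is not, and so starts a second track.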
FIG. 2 is an illustration of a viewer exposure measurement system 200 installed in an environment 202, in accordance with an embodiment of the present disclosure. With reference to FIG. 2, there is shown the viewer exposure measurement system 200 comprising a first motion sensor 204 and a second motion sensor 206, wherein each of the first motion sensor 204 and the second motion sensor 206 is deployed on one side of an elevator entrance 208. Herein, the first motion sensor 204 and the second motion sensor 206 are passive infrared (PIR) sensors. Moreover, the viewer exposure measurement system 200 comprises a third motion sensor 210. Herein, the third motion sensor 210 is an ultrasonic sensor mounted on a ceiling of the elevator entrance 208. Furthermore, the viewer exposure measurement system 200 comprises a processing unit 212 that is operatively coupled to the first motion sensor 204, the second motion sensor 206, and the third motion sensor 210. It will be appreciated that the combination of the first motion sensor 204, the second motion sensor 206, and the third motion sensor 210 creates a designated viewing area (namely, a detection zone) around the elevator entrance 208. Additionally, the first motion sensor 204, the second motion sensor 206, and the third motion sensor 210 detect a movement of a potential viewer 214 within the designated viewing area. In this regard, the first motion sensor 204 and the second motion sensor 206 perform detection when the potential viewer 214 moves through the designated viewing area, while the third motion sensor 210 performs detection when the potential viewer 214 enters or exits through the elevator entrance 208. In this regard, optionally, the first motion sensor 204, the second motion sensor 206, and the third motion sensor 210 perform detection in a 90°-180° range thereof.
More optionally, a vertical field of view and a horizontal field of view of the first motion sensor 204, the second motion sensor 206, and the third motion sensor 210 could be close to 90°-180°. The processing unit 212 is configured to acquire motion sensor data. Optionally, the motion sensor data is uploaded to a remote server 216. Optionally, the motion sensor data is processed using a simple algorithm to determine a viewing time and a dwell time of the given potential viewer 214 in between the set of consecutive movement events, and to determine a number of the plurality of potential viewers 214 during a pre-defined timeline based on a plurality of discrete sets of consecutive movement events of the plurality of potential viewers 214 within the designated viewing area. There is also shown a display unit 218 configured to display the viewing time, the dwell time, and the number of the plurality of potential viewers 214 in real-time or over the pre-defined timeline. Furthermore, there is also shown the potential viewer 214 viewing a DOOH advertising screen 220.
Modifications to embodiments of the present disclosure described in the foregoing are possible without departing from the scope of the present disclosure as defined by the accompanying claims. Expressions such as "including", "comprising", "incorporating", "have", "is" used to describe and claim the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural. The word "exemplary" is used herein to mean "serving as an example, instance or illustration". Any embodiment described as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments. The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". It is appreciated that certain features of the present disclosure, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the present disclosure, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable combination or as suitable in any other described embodiment of the disclosure.
CLAIMS
I/We claim:
1. A viewer exposure measurement system (100, 200) comprising:
at least two motion sensors (102), each motion sensor positioned in a distinct region of a designated viewing area, and configured to detect a movement of a potential viewer (214) within the designated viewing area; and
a processing unit (104, 212), communicably coupled to the at least two motion sensors, configured to:
acquire motion sensor data from each of the at least two motion sensors, wherein the motion sensor data comprises timestamps of a plurality of movement events of a plurality of potential viewers,
identify a set of consecutive movement events of a given potential viewer from each of the at least two motion sensors occurring within a specific timeframe, to determine a viewing time and a dwell time of the given potential viewer in between the set of consecutive movement events, and
determine a number of the plurality of potential viewers during a pre-defined timeline based on a plurality of discrete sets of consecutive movement events of the plurality of potential viewers within the designated viewing area.
2. The viewer exposure measurement system (100, 200) as claimed in claim 1, wherein the movement of a potential viewer (214) within the designated viewing area comprises a movement across the designated viewing area and/or a movement through the designated viewing area.
3. The viewer exposure measurement system (100, 200) as claimed in claim 2, wherein the viewing time of the given potential viewer in between a set of consecutive movement events across the designated viewing area occurs within the specific timeframe ranging from 10 milliseconds to 5 seconds, and the dwell time of the given potential viewer in between a set of consecutive movement events through the designated viewing area occurs within the specific timeframe of at least 5 seconds.
4. The viewer exposure measurement system (100, 200) as claimed in claim 1, wherein each of the plurality of discrete sets of consecutive movement events of the plurality of potential viewers (214) is separated by at least 7 seconds.
5. The viewer exposure measurement system (100, 200) as claimed in claim 1, wherein the at least two motion sensors (102) are selected from: an infrared (IR) motion sensor, a passive infrared (PIR) sensor, a microwave sensor, an acoustic sensor, a tomographic sensor, a proximity sensor, an ultrasonic sensor, a capacitive sensor, an optical sensor, or a combination thereof.
6. The viewer exposure measurement system (100, 200) as claimed in claim 1, further comprising a display unit (106, 218) configured to display the viewing time, the dwell time, and the number of the plurality of potential viewers (214) in real-time or over the pre-defined timeline.
7. The viewer exposure measurement system (100, 200) as claimed in claim 1, further comprising a communication interface (108) configured to transmit the viewing time, the dwell time, and the number of the plurality of potential viewers (214) to a remote server (110, 216), wherein the communication interface is communicably coupled with the processing unit (104, 212) and the remote server.
8. The viewer exposure measurement system (100, 200) as claimed in claim 1, further comprising at least one additional sensor positioned in proximity to the designated viewing area, said additional sensor configured to provide complementary data for refining an identification of the given potential viewer and tracking of the movement events.
9. The viewer exposure measurement system (100, 200) as claimed in claim 1, further comprising a memory unit configured to store historical potential viewer data, facilitating long-term analysis and trend identification in viewer engagement.
10. The viewer exposure measurement system (100, 200) as claimed in claim 1, wherein the processing unit (104, 212) employs an algorithm that utilizes spatial information from each of the at least two motion sensors (102) to differentiate between the potential viewers (214) and overlapping movement events.

Documents

Application Documents

# Name Date
1 202441009697-STATEMENT OF UNDERTAKING (FORM 3) [13-02-2024(online)].pdf 2024-02-13
2 202441009697-POWER OF AUTHORITY [13-02-2024(online)].pdf 2024-02-13
3 202441009697-FORM FOR SMALL ENTITY(FORM-28) [13-02-2024(online)].pdf 2024-02-13
4 202441009697-FORM FOR SMALL ENTITY [13-02-2024(online)].pdf 2024-02-13
5 202441009697-FORM 1 [13-02-2024(online)].pdf 2024-02-13
6 202441009697-FIGURE OF ABSTRACT [13-02-2024(online)].pdf 2024-02-13
7 202441009697-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [13-02-2024(online)].pdf 2024-02-13
8 202441009697-EVIDENCE FOR REGISTRATION UNDER SSI [13-02-2024(online)].pdf 2024-02-13
9 202441009697-DRAWINGS [13-02-2024(online)].pdf 2024-02-13
10 202441009697-DECLARATION OF INVENTORSHIP (FORM 5) [13-02-2024(online)].pdf 2024-02-13
11 202441009697-COMPLETE SPECIFICATION [13-02-2024(online)].pdf 2024-02-13