ABSTRACT
A MONITORING SYSTEM AND A METHOD FOR MONITORING DISPLAY ON A HOARDING AT A LOCATION
The present disclosure relates to a monitoring system (104) and a method for monitoring a display (103) on a hoarding (102) at a location. A capturing unit (101) captures a real-time image (206) of the display (103) with a timestamp, for a predefined number of days. A processing unit (105) is paired with the capturing unit (101) using respective Media Access Control (MAC) address. The processing unit (105) receives the real-time image (206) with the timestamp. The real-time image (206) is compared with pre-stored images (207) of the display (103) on the hoarding (102). An alert (209) is provided to manage the display (103), based on the comparison. The paired capturing and processing unit is geo-tagged with the location of the hoarding (102). Each of the processing unit (105) and the capturing unit (101) is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding (102).
Figure 1B
FORM 2
THE PATENTS ACT 1970
[39 OF 1970]
&
THE PATENTS RULES, 2003
COMPLETE SPECIFICATION
[See section 10 and Rule 13]
TITLE: “A MONITORING SYSTEM AND A METHOD FOR MONITORING DISPLAY ON A HOARDING AT A LOCATION”
Name and Address of the Applicant:
TITAN COMPANY LIMITED
“Integrity” No.193, Veerasandra, Electronics City P.O., Off Hosur Main Road,
Bangalore - 560100
Nationality: Indian
The following specification particularly describes the invention and the manner in which it is to be performed.
TECHNICAL FIELD
[001] The present disclosure relates to monitoring of hoardings. More particularly, the present disclosure relates to a monitoring system and a method for monitoring a display on the hoarding at a location.
BACKGROUND
[002] Hoardings are large outdoor advertisement structures that display advertisements of various products, services, news, and the like. The hoardings enable a company, a business, and such entities to capture the attention of viewers towards the advertisements. The viewers include, but are not limited to, pedestrians and drivers. The hoardings are positioned at varied locations including, but not limited to, alongside a road, in high-traffic areas, in a stadium, and so on. The effectiveness of a hoarding in capturing the attention of viewers depends on its placement. The hoardings are placed at various angles with respect to the viewers to maximize visibility. The company contracts a hoarding agency for advertising services. A cost agreement between the company and the hoarding agency is based on the size of the hoarding, the location of the hoarding, and a set duration for advertising the products of the company. However, placement of a different hoarding than the one intended during the set duration can lead to an enormous loss to the company in terms of brand value and money spent. Hence, there is a need for monitoring of the hoardings to prevent the placement of hoardings different from the intended hoardings during the set duration.
[003] Traditional techniques for monitoring the hoardings comprise using cameras to capture real-time images/videos of the hoardings. The hoardings are monitored based on the real-time images/videos and previously captured images/videos. However, in the traditional techniques, the cameras placed near each hoarding at different locations are not secured. Problems such as stealing of the cameras, tampering with the cameras, and the like are not addressed in the traditional techniques. This may hinder the monitoring of the hoardings and may result in costs to the company. Hence, there is a need for an improved system for increasing the security of the components of a system used for monitoring the hoardings.
[004] The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY
[005] In an embodiment, the present disclosure discloses a monitoring system for monitoring a display on a hoarding at a location. The monitoring system comprises a capturing unit and a processing unit. The capturing unit is configured to capture at least one real-time image of the display on the hoarding at the location. The at least one real-time image is captured with a timestamp, for a predefined number of days. The processing unit is paired with the capturing unit using respective Media Access Control (MAC) address. The processing unit is configured to receive the at least one real-time image with the timestamp, from the capturing unit. Further, the processing unit is configured to compare the at least one real-time image with one or more pre-stored images of the display on the hoarding, for a predefined number of days. Thereafter, the processing unit is configured to provide an alert to manage the display on the hoarding at the location, based on the comparison. The paired capturing and processing unit is geo-tagged with the location of the hoarding. Each of the processing unit and the capturing unit is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding.
[006] In an embodiment, the present disclosure discloses a method for monitoring a display on a hoarding at a location using a monitoring system. The monitoring system comprises a capturing unit and a processing unit. The processing unit is paired with the capturing unit, using respective Media Access Control (MAC) address. The method comprises capturing, by the capturing unit, at least one real-time image of the display on the hoarding at the location. The at least one real-time image is captured with a timestamp, for a predefined number of days. Further, the method comprises receiving, by the processing unit, the at least one real-time image with the timestamp, from the capturing unit. Furthermore, the method comprises comparing, by the processing unit, the at least one real-time image with one or more pre-stored images of the display on the hoarding. Thereafter, the method comprises providing, by the processing unit, an alert to manage the display on the hoarding at the location, based on the comparison. The paired capturing and processing unit is geo-tagged with the location of the hoarding. Each of the processing unit and the capturing unit is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding.
[007] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE ACCOMPANYING DRAWINGS
[008] The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives, and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures wherein like reference numerals represent like elements and in which:
[009] Figures 1A, 1B, and 1C show an exemplary environment illustrating monitoring of a display on a hoarding at a location, in accordance with some embodiments of the present disclosure;
[0010] Figure 2 illustrates the internal architecture of a processing unit, in accordance with some embodiments of the present disclosure;
[0011] Figure 3 shows an exemplary flow chart illustrating method steps for monitoring a display on a hoarding at a location, in accordance with some embodiments of the present disclosure;
[0012] Figures 4A, 4B, 4C and 4D show exemplary illustrations for monitoring a display on a hoarding at a location, in accordance with some embodiments of the present disclosure;
[0013] Figure 5 shows an exemplary central dashboard for monitoring multiple hoardings installed at different locations, in accordance with some embodiments of the present disclosure; and
[0014] Figure 6 shows a block diagram of a general-purpose computing system for monitoring a display on a hoarding at a location, in accordance with embodiments of the present disclosure.
[0015] It should be appreciated by those skilled in the art that any block diagram herein represents conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION
[0016] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0017] While the disclosure is susceptible to various modifications and alternative forms, specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0018] The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
[0019] Embodiments of the present disclosure relate to a monitoring system and a method for monitoring a display on a hoarding at a location. The objectives of the proposed monitoring system include monitoring the display on the hoarding and increasing the security of the components of the monitoring system. The monitoring system includes a capturing unit and a processing unit. The capturing unit captures images of the display on the hoarding. The processing unit receives the captured images. The processing unit compares the images with pre-stored images of the display. Alerts are provided based on the comparison. Hence, the display on the hoarding is monitored. Further, the present disclosure discloses pairing and location geo-tagging of the capturing unit and the processing unit. This enhances the security of the monitoring system and ensures the monitoring of the display on the hoarding at the location.
[0020] Figures 1A, 1B, and 1C show an exemplary environment 100 illustrating monitoring of a display on a hoarding at a location. The environment 100 shown in Figure 1A comprises a hoarding 102, a display 103 on the hoarding 102, and a capturing unit 101. The hoarding 102 may be a digital hoarding, a poster hoarding, a painted hoarding, and the like. A person skilled in the art will appreciate that hoardings are not limited to the above-mentioned types and may have any form of display. The display 103 on the hoarding 102 may be an advertisement of a company. For example, a product of a company may be displayed. In another example, information about an upcoming event relating to the company or the product may be displayed. In an example as shown in Figure 1A, the display 103 illustrates a watch of Y brand and X model. The capturing unit 101 may be configured to capture at least one real-time image of the display 103 on the hoarding 102 at the location. The capturing unit 101 may capture the display 103 with a timestamp. The capturing unit 101 may be an image capturing unit or a video capturing unit. In an embodiment, the capturing unit 101 may be a camera. A person skilled in the art will appreciate that other kinds of capturing units may be used (e.g., thermal cameras, Infrared (IR) cameras, and the like). The capturing unit 101 may be placed in front of the display 103 such that a focal view of the capturing unit 101 covers the entire display 103. An exemplary focal view of the capturing unit 101 is represented by dotted lines in Figure 1A. In an embodiment, the capturing unit 101 may be configured to capture a video of the display 103. For example, the display 103 may be a digital display and may display a video of the product. The displayed video may be dynamic in nature. The capturing unit 101 may focus on the entire display 103 and capture the video for its complete duration. In an embodiment, one or more capturing units 101 may be used for a single hoarding.
For example, the hoarding 102 may be a curved hoarding. Three capturing units with different focal views may be used to capture images of the entire display 103. The capturing unit 101 may be configured to capture the at least one real-time image for a predefined number of days and a pre-determined duration of time in each of the predefined number of days. For example, a display may be scheduled to be displayed between 01/01/2020 and 10/01/2020. The capturing unit 101 may be configured to capture the at least one real-time image of the display from 10:00 on 01/01/2020 to 18:00 on 10/01/2020. For each day between 01/01/2020 and 10/01/2020, the capturing unit 101 may be configured to capture the at least one real-time image from 10:00 to 18:00. In an embodiment, the predefined number of days may be selected based on the number of days a product of the company is displayed on the hoarding. In an embodiment, the pre-determined duration of time may be the daytime hours of a day, busy hours in the location of the hoarding, and so on. In an embodiment, the capturing unit 101 may be configured to capture the at least one real-time image of the display throughout the day.
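The capture schedule described above may be sketched, purely by way of illustration, as follows; the campaign dates, the daily hours, and the function name should_capture are hypothetical assumptions made for the example, not values or names mandated by the present disclosure:

```python
from datetime import datetime

# Illustrative campaign window matching the example dates above
# (01/01/2020 to 10/01/2020, captured daily from 10:00 to 18:00).
CAMPAIGN_START = datetime(2020, 1, 1)
CAMPAIGN_END = datetime(2020, 1, 10, 23, 59, 59)
DAILY_START_HOUR = 10  # 10:00
DAILY_END_HOUR = 18    # 18:00

def should_capture(now: datetime) -> bool:
    """Return True when `now` falls within the predefined number of
    days and within the pre-determined daily duration of time."""
    within_days = CAMPAIGN_START <= now <= CAMPAIGN_END
    within_hours = DAILY_START_HOUR <= now.hour < DAILY_END_HOUR
    return within_days and within_hours
```

A capturing unit could poll such a check before each capture; setting the daily window to the whole day corresponds to the embodiment in which images are captured throughout the day.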
[0021] The environment 100 shown in Figure 1B comprises a monitoring system 104, a cloud server 106, and an administrative unit 107. The monitoring system 104 may be used to monitor the display 103 on the hoarding 102. The monitoring system 104 comprises the capturing unit 101 and a processing unit 105. The capturing unit 101 and the processing unit 105 may be paired with each other. In an embodiment, the processing unit 105 may be a microcontroller. In another embodiment, the processing unit 105 may be an Internet of Things (IoT) device. The processing unit 105 may receive the at least one real-time image of the display 103 from the capturing unit 101. The processing unit 105 may compare the at least one real-time image of the display 103 with one or more pre-stored images of the display 103. For example, the capturing unit 101 may capture a first real-time image on a second day of predefined ‘m’ days. The processing unit 105 may compare the first real-time image of a display of a company X on a first hoarding with one or more pre-stored images of that display for the second day. In an embodiment, the one or more pre-stored images may be stored in the cloud server 106. The processing unit 105 may be configured to retrieve the one or more pre-stored images from the cloud server 106, in real-time, for performing the comparison. In an embodiment, the at least one real-time image with the timestamp may be uploaded to the cloud server 106. In an embodiment, the processing unit 105 may be an integral part of the cloud server 106. In such an embodiment, the one or more pre-stored images and the at least one real-time image with the timestamp may be stored in the processing unit 105. In an embodiment, the processing unit 105 may be configured to provide the at least one real-time image with the timestamp to the cloud server 106, upon comparison.
The at least one real-time image captured by the capturing unit 101 at different instances may be stored in the cloud server 106. In an embodiment, the capturing unit 101 may be in direct communication with the cloud server 106 to provide the at least one real-time image of the display. In an embodiment, the monitoring system 104 may constantly be in communication with the cloud server 106. In an embodiment, the monitoring system 104 may be in communication with the cloud server 106 for the pre-determined duration of time in the predefined number of days. The monitoring system 104 may communicate the at least one real-time image dynamically to the cloud server 106. In an embodiment, the monitoring system 104 may not be connected to the cloud server 106 for a certain time duration, due to a network problem or a power supply problem. The monitoring system 104 may receive multiple real-time images with corresponding timestamps of the display 103 and store the real-time images with corresponding timestamps in a memory, temporarily. The memory may be associated with one of the processing unit 105 and the capturing unit 101. The monitoring system 104 may transmit the real-time images with corresponding timestamps to the cloud server 106, upon establishing the connection with the cloud server 106.
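The temporary buffering and later transmission described above may be sketched as follows; the class and method names, and the upload callable standing in for the connection to the cloud server 106, are illustrative assumptions:

```python
from collections import deque

class ImageBuffer:
    """Temporarily holds timestamped images in local memory while
    the cloud server is unreachable, then flushes them once the
    connection is re-established."""

    def __init__(self):
        self._pending = deque()

    def store(self, image, timestamp):
        # Keep the real-time image together with its timestamp.
        self._pending.append((image, timestamp))

    def flush(self, upload):
        # Transmit buffered images in capture order, oldest first,
        # and return how many were sent.
        sent = 0
        while self._pending:
            image, timestamp = self._pending.popleft()
            upload(image, timestamp)
            sent += 1
        return sent
```

In practice such a buffer could live on either the processing unit 105 or the capturing unit 101, matching the embodiment in which the memory is associated with one of the two.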
[0022] Further, the processing unit 105 may be configured to provide an alert based on the comparison. The alert may be provided to the administrative unit 107. The administrative unit 107 may retrieve relevant data from the processing unit 105 via the cloud server 106. The administrative unit 107 may be configured to manage the display 103 on the hoarding 102. For example, the administrative unit 107 may be a marketing team of the company relating to the product displayed on the hoarding. In another example, the administrative unit 107 may be led by an individual and the alert may be received by the individual. The individual, for example, may be a brand manager. The administrative unit 107 may take necessary actions to manage the display 103 on the hoarding 102 based on the alerts.
[0023] Figure 1C shows the environment 100 illustrating pairing of the capturing unit 101 and the processing unit 105. The monitoring system 104, comprising the capturing unit 101 and the processing unit 105, configured to monitor the hoarding, may be dedicated to a single hoarding. In an embodiment, the monitoring system 104 may be configured to monitor multiple hoardings at different locations. In such a case, each of the hoardings may be associated with a corresponding capturing unit 101 and a single processing unit 105. The environment 100 comprises the administrative unit 107 managing the display 103 on the hoarding 102. The administrative unit 107 may be configured to manage a plurality of hoardings and the respective monitoring systems 104 at different locations. For example, the administrative unit 107 may be associated with a company. The administrative unit 107 may manage the plurality of hoardings of the company and the display 103 on each hoarding. An example of the administrative unit 107 managing the display 103 on the hoarding 102 at two locations M and N is illustrated in Figure 1C. The capturing unit 1011, a communication network 1081, and the processing unit 1051 may be associated with a display 1 on a hoarding 1 at location M. The capturing unit 1012, a communication network 1082, and the processing unit 1052 may be associated with a display 2 on a hoarding 2 at location N. In general, the capturing unit, the communication network, and the processing unit are referred to as 101, 108, and 105, respectively, in the present description. The capturing unit 101 and the processing unit 105 may communicate over the communication network 108. The at least one real-time image with the timestamp may be transmitted wirelessly over the communication network 108.
The communication network 108 may be Wireless Fidelity (Wi-Fi), Bluetooth, Global System for Mobile communication (GSM), Zigbee, and the like. In an embodiment, the at least one real-time image with the timestamp may be transmitted from the processing unit 105 to the cloud server 106 over a wired medium. The wired medium may be a twisted pair cable, a coaxial cable, an optic fibre, and the like. The processing unit 105 may be paired with the capturing unit 101 using respective Media Access Control (MAC) address. The pairing sets up a linkage between the capturing unit 101 and the processing unit 105 to allow communication between them. The MAC address may be a hardware address associated with each of the capturing unit 101 and the processing unit 105. The hardware address is associated with each device to uniquely identify the device over the communication network 108. The capturing unit 1011, the processing unit 1051, the capturing unit 1012, and the processing unit 1052 are referred to as devices A, B, C, and D, respectively. Each of the devices A, B, C, and D is associated with a MAC address. The capturing unit 1011 and the processing unit 1051 communicating over the communication network 1081 are paired with MAC address 1 and MAC address 2, respectively. The MAC address is a 12-digit hexadecimal number. The MAC address 1 and the MAC address 2 are represented as WW:WW:WW:WW:WW:WW and XX:XX:XX:XX:XX:XX respectively, as shown in Figure 1C. The pairing may be controlled by the administrative unit 107. The capturing unit 101 and the processing unit 105 may be paired during installation of the hoarding 102. The administrative unit 107 may maintain an access control list 109. The access control list 109 may comprise the MAC addresses of authenticated devices in the communication network 1081. As shown in Figure 1C, the access control list 109 comprises the MAC addresses of the capturing unit 1011 and the processing unit 1051.
Similarly, the access control list 109 comprises the MAC addresses of the capturing unit 1012 and the processing unit 1052, i.e., YY:YY:YY:YY:YY:YY and ZZ:ZZ:ZZ:ZZ:ZZ:ZZ respectively. In some scenarios, the capturing unit 1011 and the processing unit 1051 may be unpaired. Unpairing may be due to a network problem, movement of one of the capturing unit 1011 and the processing unit 1051 out of the range of the communication network 1081, manual unpairing, and the like. The capturing unit 1011 and the processing unit 1051 are disabled when unpaired. The administrative unit 107 may also ensure that no other devices can be paired with the capturing unit 1011 and the processing unit 1051, using the access control list 109. This ensures security and avoids tampering of the capturing unit 1011 and the processing unit 1051. Also, the capturing unit 1011 and the processing unit 1051 are geo-tagged with the location M of the hoarding 1. At least one of the capturing unit 1011 and the processing unit 1051 is disabled when the real-time location of at least one of the capturing unit 1011 and the processing unit 1051 is different from the location M. Similarly, the capturing unit 1012 and the processing unit 1052 are geo-tagged with the location N of the hoarding 2. The administrative unit 107 may maintain geo-tagged location information 110 for the hoardings 1 and 2. For example, twenty hoardings of company X may be installed at various locations. The administrative unit 107 may be a department of the company X. The department may maintain the access control list 109 with the MAC addresses of the capturing units and processing units associated with the twenty hoardings and the geo-tagged location information 110.
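The check against the access control list 109 may be sketched as follows; the placeholder MAC addresses follow Figure 1C, while the dictionary layout and the function name are illustrative assumptions, not part of the disclosure:

```python
# Illustrative access control list 109: for each hoarding, the set of
# MAC addresses of its authenticated capturing and processing units.
ACCESS_CONTROL_LIST = {
    "hoarding_1": {"WW:WW:WW:WW:WW:WW", "XX:XX:XX:XX:XX:XX"},
    "hoarding_2": {"YY:YY:YY:YY:YY:YY", "ZZ:ZZ:ZZ:ZZ:ZZ:ZZ"},
}

def is_authenticated(hoarding_id: str, mac_address: str) -> bool:
    """Only a device whose MAC address appears in the list for the
    given hoarding may take part in the paired communication; any
    other device is refused, preventing unauthorized pairing."""
    return mac_address in ACCESS_CONTROL_LIST.get(hoarding_id, set())
```

An administrative unit could apply such a check before allowing any pairing, so that no foreign device can be paired with a capturing unit or processing unit of a monitored hoarding.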
[0024] Figure 2 illustrates the internal architecture 200 of the processing unit 105, in accordance with some embodiments of the present disclosure. The processing unit 105 may be used to monitor the display 103 on the hoarding 102 at the location. The processing unit 105 may include at least one Central Processing Unit 203 (also referred to as “CPU” or “processor”) and a memory 202 storing instructions executable by the at least one processor 203. The at least one processor 203 may comprise at least one data processor for executing program components for executing user or system-generated requests. The memory 202 may be communicatively coupled to the at least one processor 203. The memory 202 stores instructions, executable by the at least one processor 203, which, on execution, may cause the at least one processor 203 to receive the at least one real-time image with the timestamp, compare the at least one real-time image with the one or more pre-stored images, and provide the alert based on the comparison, as disclosed in the present disclosure. In an embodiment, the memory 202 may include one or more modules 205 and data 204. The one or more modules 205 may be configured to perform the steps of the present disclosure using the data 204, to receive the at least one real-time image with the timestamp, compare the at least one real-time image with the one or more pre-stored images, and provide the alert based on the comparison. In an embodiment, each of the one or more modules 205 may be a hardware unit which may be outside the memory 202 and coupled with the processing unit 105. The processing unit 105 further comprises an Input/Output (I/O) interface 201. The I/O interface 201 is coupled with the at least one processor 203 through which an input signal or/and an output signal is communicated. For example, the processing unit 105 may receive the at least one real-time image from the capturing unit 101 via the I/O interface 201.
Also, the processing unit 105 may transmit the alert to the administrative unit 107 and the at least one real-time image to the cloud server 106 via the I/O interface 201. In an embodiment, the processing unit 105, for receiving the at least one real-time image with the timestamp, comparing the at least one real-time image with the one or more pre-stored images, and providing the alert based on the comparison, may be implemented in a variety of computing systems, such as a laptop computer, a desktop computer, a Personal Computer (PC), a notebook, a smartphone, a tablet, an e-book reader, a server, a network server, a cloud-based server, and the like.
[0025] In one implementation, the one or more modules 205 may include, for example, a real-time image input module 213, a comparison module 214, a notification module 215, and other modules 216. It will be appreciated that such aforementioned modules 205 may be represented as a single module or a combination of different modules.
[0026] In an embodiment, the data 204 may be stored within the memory 202. The data 204 may include, for example, real-time image data 206, pre-stored image data 207, comparison data 208, alert data 209, pairing data 210, location data 211 and other data 212.
[0027] In an embodiment, the data 204 in the memory 202 may be processed by the one or more modules 205 of the processing unit 105. As used herein, the term one or more modules 205 refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a Field-Programmable Gate Arrays (FPGA), a Programmable System-on-Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. The one or more modules 205 when configured with the functionality defined in the present disclosure will result in a novel hardware.
[0028] In an embodiment, the real-time image input module 213 may receive the at least one real-time image of the display 103 on the hoarding 102. The capturing unit 101 may capture the at least one real-time image with the timestamp, for the pre-determined duration of time in each of the predefined number of days. The capturing unit 101 may dynamically provide the at least one real-time image with the timestamp to the processing unit 105. The real-time image input module 213 may receive the at least one real-time image with the timestamp, for the pre-determined duration of time in each of the predefined number of days. For example, the display 103 may be intended on the hoarding from day 1 to day 10. The real-time image input module 213 may receive the at least one real-time image at 10:00 AM and 8:00 PM. In an embodiment, the real-time image input module 213 may pre-process the at least one real-time image. The pre-processing may include, but is not limited to, compressing the data, removing noise, normalizing, analog-to-digital conversion, changing the format, and the like. The at least one real-time image with the timestamp may be stored as the real-time image data 206 in the memory 202 of the processing unit 105. The at least one real-time image is referred to as the real-time image 206 hereafter, in the present description.
[0029] In an embodiment, the comparison module 214 may receive the real-time image data 206 with the timestamp. The comparison module 214 may be configured to compare the at least one real-time image 206 with the one or more pre-stored images 207 of the display 103 on the hoarding 102. The comparison module 214 may determine if the timestamp associated with the at least one real-time image is within the predefined number of days. The one or more pre-stored images of the display 103 and information relating to the predefined number of days may be stored in the cloud server 106. The comparison module 214 may retrieve the one or more pre-stored images of the display 103 and the information relating to the predefined number of days from the cloud server 106. In an embodiment, the one or more pre-stored images may be stored as the pre-stored image data. The one or more pre-stored images is referred to as the one or more pre-stored images 207 hereafter, in the present description. For example, the comparison module 214 may compare a real-time image of a first display with one or more pre-stored images of the first display on 02/09/2020, the second day of ‘m’ predefined days (01/09/2020-20/09/2020). The timestamp associated with the real-time image may be 12:00 on 02/09/2020. The comparison module 214 may determine that the timestamp is within the ‘m’ predefined days (01/09/2020-20/09/2020). Similarly, the comparison module 214 may compare a real-time image of a second display with one or more pre-stored images of the second display. The comparison module 214 may be further configured to identify presence of a damage in the display 103 on the hoarding 102. Further, the comparison module 214 may be configured to identify the real-time image 206 of the display 103 on the hoarding 102 being different from the one or more pre-stored images 207.
The comparison module 214 may identify the damage in the display 103 by identifying a difference in the set of features associated with the at least one real-time image 206 and the one or more pre-stored images 207. The comparison module 214 may identify the real-time image 206 of the display 103 on the hoarding 102 being different from the one or more pre-stored images 207 when most or all of the set of features of the real-time image 206 are different from that of the one or more pre-stored images 207. In an embodiment, the comparison module 214 may provide one or more alerts directly based on the comparison. The one or more alerts may be based on the result of the comparison. For example, the one or more alerts may be based on an output value based on similarity of images, obtained using an image comparison technique. In an embodiment, the comparison module 214 may provide output signals. The one or more alerts may be provided based on the output signals from the comparison module 214. In one example, the comparison module 214 may provide an output of value “-1” when the real-time image 206 of the display 103 on the hoarding 102 is different from the one or more pre-stored images 207. The comparison module 214 may provide an output of value “0” when the damage of the display 103 is identified. The comparison module 214 may provide an output of value “1” when the at least one real-time image 206 and the one or more pre-stored images 207 are the same. In another example, no output from the comparison module 214 may indicate that the at least one real-time image and the one or more pre-stored images 207 are the same. A person skilled in the art will appreciate that signals of different values and in different representations may be provided to indicate the output of the comparison. The output of the comparison module 214 may be stored as the comparison data 208 in the memory 202 of the processing unit 105.
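The three-valued output of the comparison module 214 may be sketched as follows; representing the set of features as a Python set and using a 50% overlap threshold to decide that "most or all" features differ are illustrative assumptions made for the example:

```python
def compare_features(real_time: set, pre_stored: set) -> int:
    """Return 1 when the images match, 0 when a partial difference
    suggests damage to the intended display, and -1 when most or all
    features differ (i.e., a different display is on the hoarding)."""
    if real_time == pre_stored:
        return 1
    shared = len(real_time & pre_stored)
    total = max(len(pre_stored), 1)
    # Most or all features differ: a different display is present.
    if shared / total < 0.5:
        return -1
    # Only some features differ: the intended display may be damaged.
    return 0
```

Any concrete feature extractor (edges, keypoints, text regions, and the like) could feed such a comparator; the disclosure leaves the image comparison technique open.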
[0030] In an embodiment, the notification module 215 may receive the comparison data 208. The notification module 215 may be configured to provide an alert to manage the display 103 on the hoarding 102 at the location, based on the comparison data 208. In an embodiment, the notification module 215 may be configured to provide the alert to manage the display 103 on the hoarding 102, to the administrative unit 107. In another embodiment, the notification module 215 may be configured to provide a plurality of alerts based on the comparison. For example, the notification module 215 may provide a first alert and a second alert when the output from the comparison module is -1 and 0, respectively. The alert may be provided in one or more forms known to a person skilled in the art. For example, the alert may be a text. In another example, the alert may be a sound signal. The administrative unit 107 may take the necessary actions to manage the display 103 on the hoarding 102 based on the alerts. The one or more alerts may be stored as the alert data 209 in the memory 202 of the processing unit 105. The alert data 209 may be used by the administrative unit 107 to keep a record for managing the display 103 on the hoarding 102. For example, a first display may be an intended display X. A second display may refer to a different display Y. A first alert indicating display of Y may be transmitted to the administrative unit 107 at a time T1 of a day. Upon manually checking the hoarding 102 at time T2 of the day, X may be observed. Again, the first alert may be received at time T3 of the day, indicating the display of Y. The one or more alerts may be used accordingly to manage payment relating to the display 103. The terms "one or more alerts" and "an alert" are used interchangeably in the present description.
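The notification logic of this paragraph may be sketched as follows; the alert texts are illustrative placeholders, since the disclosure leaves the alert form (text, sound signal, etc.) open:

```python
def alerts_for_output(output):
    """Return the alert(s) the notification module might raise for a
    given comparison output. Alert wording is a placeholder only."""
    if output == -1:
        return ["first alert: different display detected"]
    if output == 0:
        return ["second alert: display damage detected"]
    return []  # output 1 (or no output): intended display, nothing to raise
```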
[0031] In an embodiment, the pairing data 210 may comprise the MAC addresses of the processing unit 105 and the capturing unit 101. The pairing data 210 may further comprise a device name and device identification number of the capturing unit 101. In an embodiment, the location data 211 may comprise the location used for geo-tagging. The pairing data 210 and the location data 211 may be used to enable and disable the processing unit 105 and the capturing unit 101. For example, the location data 211 may be updated based on a real-time location. A change in the location data 211 may provide an indication to the administrative unit 107 to disable at least one of the processing unit 105 and the capturing unit 101. In another example, the pairing data 210 may include data indicating the pairing of the processing unit 105 and the capturing unit 101. A change in the pairing data 210 when the processing unit 105 and the capturing unit 101 are unpaired may result in disabling the capturing unit 101. Further, the change in the pairing data 210 may provide an indication to the administrative unit 107 to disable the processing unit 105. In an embodiment, the capturing unit 101 and the processing unit 105 may be automatically disabled upon the unpairing or a change in the real-time location of the capturing unit 101 and the processing unit 105.
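The enable/disable condition built from the pairing data 210 and the location data 211 may be sketched as a single predicate; the parameter names are illustrative assumptions:

```python
def unit_enabled(paired, registered_mac, current_mac,
                 geotag_location, current_location):
    """A unit stays enabled only while the MAC pairing is intact and the
    real-time location matches the geo-tagged installation location.
    Any unpairing or relocation disables the unit."""
    return (paired
            and registered_mac == current_mac
            and geotag_location == current_location)
```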
[0032] The other data 212 may store data, including temporary data and temporary files, generated by the one or more modules 205 for performing the various functions of the processing unit 105. The one or more modules 205 may also include the other modules 216 to perform various miscellaneous functionalities of the processing unit 105. It will be appreciated that such modules may be represented as a single module or a combination of different modules.
[0033] Figure 3 shows an exemplary flow chart illustrating method steps for monitoring the display 103 on the hoarding 102 at the location, in accordance with some embodiments of the present disclosure. As illustrated in Figure 3, the method 300 may comprise one or more steps. The method 300 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
[0034] The order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0035] At step 301, the capturing unit 101 may be configured to capture the at least one real-time image 206 of the display 103 on the hoarding 102 at the location. The capturing unit 101 may capture the at least one real-time image 206 with the timestamp. In an embodiment, the capturing unit 101 may capture the at least one real-time image 206 for a pre-determined duration of time in each of the predefined number of days. For example, the capturing unit 101 may capture one real-time image in the morning and one real-time image in the evening. In another example, the capturing unit 101 may capture the at least one real-time image every hour. The capturing unit 101 may be configured to operate in different modes. For example, the capturing unit 101 may be configured to operate in High Dynamic Range (HDR) mode to widen the exposure range. The capturing unit 101 may be configured to operate in different modes in the morning and at night to ensure capturing with proper lighting conditions. The capturing unit 101 may be configured to capture the at least one real-time image 206 with the timestamp. The timestamp may have the date and time at which the at least one real-time image is captured. For example, the timestamp associated with the at least one real-time image 206 may be 21/09/2020 12:00. Figure 4A illustrates an embodiment of the cloud server 106 storing the one or more pre-stored images of the display 103 and the information relating to the predefined number of days. The one or more pre-stored images and the predefined number of days are collectively referred to as 400. 401 is a pre-stored image of a first display in the cloud server 106. The first display of Y brand and X model is shown. 402 shows the predefined number of days. The first display is scheduled for displaying on a first hoarding at location L from 10:00 on 01/09/2020 to 18:00 on 20/09/2020. 403 shows a pre-stored image of a second display scheduled for displaying on the first hoarding. 404 shows the predefined number of days.
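The morning/evening capture schedule of step 301 may be sketched as below; the 09:00 and 18:00 capture hours are illustrative assumptions, as the disclosure only requires capture for a pre-determined duration in each of the predefined days:

```python
from datetime import datetime, timedelta

def capture_schedule(start_day, num_days, hours=(9, 18)):
    """Yield one capture timestamp per listed hour for each of the
    predefined number of days (morning and evening by default)."""
    for day in range(num_days):
        for hour in hours:
            yield start_day + timedelta(days=day, hours=hour)

# Two days of the example schedule starting 01/09/2020.
times = list(capture_schedule(datetime(2020, 9, 1), 2))
```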
The second display is scheduled for displaying at the location L from 10:00 on 21/09/2020 to 18:00 on 01/10/2020. Referring to example of Figure 4B, 405 shows the real-time images of the first display stored in the cloud server 106. 406 shows a real-time image of the first display captured at the timestamp 11:00 on 07/09/2020. Similarly, 407 shows another real-time image of the first display captured at the timestamp 11:00 on 15/09/2020.
[0036] At step 302, the processing unit 105 may receive the at least one real-time image 206 with the timestamp, from the capturing unit 101. The processing unit 105 may receive the at least one real-time image from the capturing unit 101 over the communication network 108. The processing unit 105 may be paired with the capturing unit 101 using respective Media Access Control (MAC) address. The paired capturing and processing unit is geo-tagged with the location of the hoarding 102. The pairing and the geo-tagging may be performed during installation of the hoarding 102. Each of the processing unit 105 and the capturing unit 101 is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding 102. Referring to the example illustrated in Figure 4B, 406 and 407 may be the real-time images received by the processing unit 105 from the capturing unit 101.
[0037] At step 303, the processing unit 105 may compare the at least one real-time image 206 of the display 103 with the one or more pre-stored images of the display 103. In an example, the one or more pre-stored images may be images of the display 103 captured from different angles. In another example, the one or more pre-stored images of the display 103 may be frames from a captured video of the display 103. The processing unit 105 may compare the at least one real-time image 206 with the one or more pre-stored images to identify if the display 103 intended for the predefined days is displayed. The processing unit 105 may be configured to identify presence of the damage in the display 103 on the hoarding 102. Further, the processing unit 105 may be configured to identify the real-time image 206 of the display 103 on the hoarding 102 being different from the one or more pre-stored images 207. The processing unit 105 may identify the damage of the display 103 by identifying differences in a set of features associated with the at least one real-time image 206 and the one or more pre-stored images 207. The processing unit 105 may identify the real-time image 206 of the display 103 on the hoarding 102 as being different from the one or more pre-stored images 207, when most or all of the features of the real-time image 206 are different from those of the one or more pre-stored images 207. Referring to the example of Figure 4B, the processing unit 105 may compare the real-time image 406 with the pre-stored image 401. The result of the comparison may be 1, since the real-time image 406 and the pre-stored image 401 are the same. Similarly, the result of the comparison may be 1, when the real-time image 407 is processed. Referring to an example of Figure 4C, a real-time image 408 is received with the timestamp 16:00 18/09/2020 from the capturing unit 101.
The processing unit 105 may identify that all the features of the real-time image 408 and the pre-stored image 401 are different. The processing unit 105 may identify that the real-time image 408 of the first display is different from the pre-stored image 401. The result of the comparison may be -1. Referring to an example of Figure 4D, a real-time image 409 is received from the capturing unit 101 with the timestamp 16:00 18/09/2020. The processing unit 105 may identify some common missing features between the real-time image 409 and the pre-stored image 401. The processing unit 105 may identify the damage of the first display. The result of the comparison may be 0. In an embodiment, pixel-by-pixel comparison may be used to compare the at least one real-time image 206 of the display 103 with the one or more pre-stored images of the display 103. A person skilled in the art will appreciate that any other image comparison technique may be used to compare two images.
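A minimal pixel-by-pixel comparison, with images represented as lists of pixel rows, may be sketched as below; this is only one possible technique, as the paragraph notes that any image comparison technique may be used:

```python
def pixel_match_fraction(img_a, img_b):
    """Compare two equally sized images pixel by pixel and return the
    fraction of matching pixels (1.0 = identical images)."""
    if len(img_a) != len(img_b) or len(img_a[0]) != len(img_b[0]):
        raise ValueError("images must have the same dimensions")
    total = len(img_a) * len(img_a[0])
    same = sum(1 for row_a, row_b in zip(img_a, img_b)
               for pa, pb in zip(row_a, row_b) if pa == pb)
    return same / total
```

The resulting fraction may then be thresholded into the -1/0/1 outputs described earlier.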
[0038] At step 304, the processing unit 105 may provide one or more alerts based on the comparison. The alert may be a notification provided when the display is different from the intended display or the display is damaged. For example, the processing unit 105 may provide a first alert and a second alert when the output is -1 and 0, respectively. In an embodiment, the one or more alerts may be provided directly based on the comparison. In another embodiment, the output of the comparison may be in the form of values -1, 0 and 1. The one or more alerts may be provided based on the values. The alert may be provided in any form. For example, the alert may be a text. In another example, the alert may be a sound signal. In an example, the alert may be a sound when the at least one real-time image 206 is different from the one or more pre-stored images 207. The alert may be a text when there is a damage to the display on the hoarding. In another example, when the damage occurs on the second day from when the display is put up, the alert may be a sound. When the damage occurs on the 19th day and a different display is scheduled from the 20th day, the alert may be a text. The administrative unit 107 may take the necessary actions to manage the display 103 on the hoarding 102 based on the one or more alerts received via the cloud server 106. The one or more alerts may be used to resolve the damage of the display 103. Also, the one or more alerts may be used accordingly to manage payment relating to the display 103. For example, a display D of a company X may be scheduled from 01/01/2020 to 10/01/2020. A different display may be displayed from 06/01/2020 to 10/01/2020. The alerts may be received at different timestamps on each of these days. The administrative unit 107 may identify the different display displayed on a hoarding on these days based on the alerts.
The at least one real-time image with the corresponding timestamp may be analysed by the administrative unit 107 to determine the time and day when the different display is displayed. The administrative unit 107 may make payment for the display from 01/01/2020 to 05/01/2020. The administrative unit 107 may also manage resolution of damage of the display 103 based on the timestamp. For example, when the alert is received indicating the damage of the display 103, the timestamp may indicate a date which is close to the display installation date. For example, the display 103 may be damaged on the third day from when the display 103 is displayed. In such a case, the administrative unit 107 may initiate resolution of the damage. When the date is the last day, or the day before a scheduled change of the display 103, the administrative unit 107 may consider ignoring the damage and not initiate resolution. In an embodiment, the alerts may be stored in the cloud server 106 in the form of texts as a record. The record may be used by the administrative unit 107 for analysing the monitoring of the display 103 on the hoarding 102. In an embodiment, the processing unit 105 may monitor focus lights installed on the hoarding 102. Monitoring of the focus lights may include checking whether the focus lights are operating properly to illuminate the hoarding. The processing unit 105 may provide an alert based on the monitoring. For example, four focus lights may be installed on a hoarding. The processing unit 105 may monitor the focus lights during the night. The processing unit 105 may identify that two focus lights are not operating. The processing unit 105 may provide the alert based on the identification. Monitoring of the focus lights on the hoarding may ensure clarity of the at least one real-time image 206 captured by the capturing unit 101.
[0039] In an embodiment, a central dashboard may present information to check a plurality of hoardings that are installed at various locations. The central dashboard may be a portal that may be viewed by different administrative units of different companies. The administrative units may use information on the central dashboard for monitoring, statistical analysis of the plurality of hoardings, and the like. Information of the plurality of hoardings may be categorized across a country, a region, a city, and the like. Further, the categorization may be provided based on brands, different hoardings for the same brand, hoarding size and cost involved, and agency-wise segregation. Figure 5 illustrates a central dashboard 500 to present brand-wise hoarding information and corresponding location information. For example, the central dashboard 500 is showing hoarding 1 and hoarding 3 with respective displays (image 1 and image 3) and locations 1 and 3 on a map. Similarly, the central dashboard 500 is showing hoarding 2 with display (image 2) and location 2 on the map. In an embodiment, the central dashboard 500 may be used to identify the type of the damage of the display 103 on the hoarding 102. The types of damage of the display 103 may comprise natural damage and man-made damage. The type of the damage may be identified based on the number of hoardings damaged in a particular geographical location. For example, an alert may be received by the administrative unit 107, indicating a damage of a first display on a first hoarding in a location 'L'. The administrative unit 107 may monitor if any other hoardings are damaged in the location 'L' from information in the central dashboard 500. The central dashboard 500 may have location-wise hoarding information. The administrative unit 107 may identify that multiple hoardings in the location 'L' have been damaged. In such a case, the administrative unit 107 may identify the type of the damage to be natural damage.
In another example, the administrative unit 107 may identify that the first hoarding is damaged, but a second hoarding adjacent to the first hoarding is not damaged. In such a case, the administrative unit 107 may identify the type of the damage as man-made damage. The administrative unit 107 may take appropriate actions accordingly.
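The damage-type heuristic of the two examples above may be sketched as follows; the threshold of two damaged hoardings is an illustrative assumption, as the disclosure only states that the classification is based on the number of hoardings damaged at a location:

```python
def damage_type(damaged_hoardings_at_location, threshold=2):
    """Classify damage as natural when multiple hoardings at the same
    location report damage, and as man-made when an isolated hoarding
    does. The threshold value is assumed for illustration."""
    if damaged_hoardings_at_location >= threshold:
        return "natural damage"
    return "man-made damage"
```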
COMPUTER SYSTEM
[0040] Figure 6 illustrates a block diagram of an exemplary computer system 600 for implementing embodiments consistent with the present disclosure. In an embodiment, the computer system 600 may be used to implement the processing unit 105. Thus, the computer system 600 may be used to receive the at least one real-time image with the timestamp, compare the at least one real-time image with the one or more pre-stored images, and provide the alert based on the comparison. The computer system 600 may comprise a Central Processing Unit 602 (also referred as "CPU" or "processor"). The processor 602 may comprise at least one data processor. The processor 602 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
[0041] The processor 602 may be disposed in communication with one or more input/output (I/O) devices (not shown) via I/O interface 601. The I/O interface 601 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE (Institute of Electrical and Electronics Engineers) -1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
[0042] Using the I/O interface 601, the computer system 600 may communicate with one or more I/O devices. For example, the input device 610 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc. The output device 611 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma, Plasma display panel (PDP), Organic light-emitting diode display (OLED) or the like), audio speaker, etc.
[0043] The computer system 600 is connected to a capturing unit 612 through a communication network 609. The capturing unit 612 is used to provide at least one real-time image of the display to the computer system 600. The processor 602 may be disposed in communication with the communication network 609 via a network interface 603. The network interface 603 may communicate with the communication network 609. The network interface 603 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network 609 may include, without limitation, a direct interconnection, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, etc.
[0044] The communication network 609 includes, but is not limited to, a direct interconnection, an e-commerce network, a peer to peer (P2P) network, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi-Fi, and such. The communication network 609 may either be a dedicated network or a shared network, which represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other. Further, the communication network 609 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
[0045] In some embodiments, the processor 602 may be disposed in communication with a memory 605 (e.g., RAM, ROM, etc. not shown in Figure 6) via a storage interface 604. The storage interface 604 may connect to memory 605 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as serial advanced technology attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc. The memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
[0046] The memory 605 may store a collection of program or database components, including, without limitation, user interface 606, an operating system 607, web browser 608 etc. In some embodiments, computer system 600 may store user/application data, such as, the data, variables, records, etc., as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle ® or Sybase®.
[0047] The operating system 607 may facilitate resource management and operation of the computer system 600. Examples of operating systems include, without limitation, APPLE MACINTOSH® OS X, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION™ (BSD), FREEBSD™, NETBSD™, OPENBSD™, etc.), LINUX DISTRIBUTIONS™ (e.g., RED HAT™, UBUNTU™, KUBUNTU™, etc.), IBM™ OS/2, MICROSOFT™ WINDOWS™ (XP™, VISTA™/7/8, 10 etc.), APPLE® IOS™, GOOGLE® ANDROID™, BLACKBERRY® OS, or the like.
[0048] In some embodiments, the computer system 600 may implement the web browser 608 stored program component. The web browser 608 may be a hypertext viewing application, for example MICROSOFT® INTERNET EXPLORER™, GOOGLE® CHROME™, MOZILLA® FIREFOX™, APPLE® SAFARI™, etc. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), etc. Web browsers 608 may utilize facilities such as AJAX™, DHTML™, ADOBE® FLASH™, JAVASCRIPT™, JAVA™, Application Programming Interfaces (APIs), etc. In some embodiments, the computer system 600 may implement a mail server (not shown in Figure) stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP™, ACTIVEX™, ANSI™ C++/C#, MICROSOFT® .NET™, CGI SCRIPTS™, JAVA™, JAVASCRIPT™, PERL™, PHP™, PYTHON™, WEBOBJECTS™, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 600 may implement a mail client stored program component. The mail client (not shown in Figure) may be a mail viewing application, such as APPLE® MAIL™, MICROSOFT® ENTOURAGE™, MICROSOFT® OUTLOOK™, MOZILLA® THUNDERBIRD™, etc.
[0049] Furthermore, one or more computer-readable storage media may be utilized in implementing embodiments consistent with the present disclosure. A computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored. Thus, a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein. The term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, Compact Disc Read-Only Memory (CD ROMs), Digital Video Disc (DVDs), flash drives, disks, and any other known physical storage media.
[0050] Embodiments of the present disclosure enhance security of a monitoring system used for monitoring a display on a hoarding, by pairing and geo-tagging components of the monitoring system.
[0051] Embodiments of the present disclosure ensure monitoring of the display by preventing tampering of the components of the monitoring system.
[0052] Embodiments of the present disclosure identify damage of the display. Hence, the damage may be resolved by the administrative unit.
[0053] Embodiments of the present disclosure identify the number of days the intended display is displayed. Hence, the payment process between a company and a hoarding agency can be easily managed.
[0054] The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
[0055] The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise.
[0056] The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
[0057] A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
[0058] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[0059] The illustrated operations of Figures 4 and 5 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified, or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
[0060] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based here on. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
[0061] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims.
Referral Numerals:
Referral Number Description
101 Capturing unit
102 Hoarding
103 Display
104 Monitoring system
105 Processing unit
106 Cloud server
107 Administrative unit
108 Communication network
201 I/O interface
202 Memory
203 Processor
204 Data
205 Modules
206 Real-time image data
207 Pre-stored image data
208 Comparison data
209 Alert data
210 Pairing data
211 Location data
212 Other data
213 Real-time image input module
214 Comparison module
215 Notification module
216 Other modules
600 Computer system
601 I/O interface
602 Processor
603 Network interface
604 Storage interface
605 Memory
606 User interface
607 Operating system
608 Web browser
609 Communication network
610 Input device
611 Output device
612 Capturing unit
CLAIMS:
We claim:
1. A monitoring system (104) for monitoring a display (103) on a hoarding (102) at a location, the monitoring system (104) comprising:
a capturing unit (101) configured to capture at least one real-time image (206) of a display (103) on a hoarding (102) at a location, wherein the at least one real-time image (206) is captured with a timestamp, for a predefined number of days; and
a processing unit (105), paired with the capturing unit (101) using respective Media Access Control (MAC) address, configured to:
receive the at least one real-time image (206) with the timestamp, from the capturing unit (101);
compare the at least one real-time image (206) with one or more pre-stored images (207) of the display (103) on the hoarding (102); and
provide an alert to manage the display (103) on the hoarding (102) at the location, based on the comparison,
wherein paired capturing and processing unit is geo-tagged with the location of the hoarding (102), wherein each of the processing unit (105) and the capturing unit (101) is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding (102).
2. The monitoring system (104) as claimed in claim 1, wherein the capturing unit (101) and the processing unit (105) are disabled when the capturing unit (101) and the processing unit (105) are unpaired.
3. The monitoring system (104) as claimed in claim 1, wherein at least one of the capturing unit (101) and the processing unit (105) is disabled when real-time location of at least one of the capturing unit (101) and the processing unit (105) is different from the location used for the geo-tagging.
4. The monitoring system (104) as claimed in claim 1, wherein monitoring of the display (103) on the hoarding (102) is performed for a pre-determined duration of time in each of the predefined number of days.
5. The monitoring system (104) as claimed in claim 1, wherein the processing unit (105) is further configured to, based on the comparison, identify, at least one of:
presence of a damage in the display (103) on the hoarding (102); and
the real-time image (206) of the display (103) on the hoarding (102) being different from the one or more pre-stored images (207).
6. A method for monitoring a display (103) on a hoarding (102) at a location using a monitoring system (104) comprising a capturing unit (101) and a processing unit (105) paired with the capturing unit (101) using their respective Media Access Control (MAC) addresses, the method comprising:
capturing, by the capturing unit (101), at least one real-time image (206) of the display (103) on the hoarding (102) at the location, wherein the at least one real-time image (206) is captured with a timestamp, for a predefined number of days; and
receiving, by the processing unit (105), the at least one real-time image (206) with the timestamp, from the capturing unit (101);
comparing, by the processing unit (105), the at least one real-time image (206) with one or more pre-stored images (207) of the display (103) on the hoarding (102); and
providing, by the processing unit (105), an alert to manage the display (103) on the hoarding (102) at the location, based on the comparison,
wherein the paired capturing unit (101) and processing unit (105) are geo-tagged with the location of the hoarding (102), wherein each of the processing unit (105) and the capturing unit (101) is configured to operate based on the pairing and the geo-tagging, for monitoring the hoarding (102).
7. The method as claimed in claim 6, further comprises disabling the capturing unit (101) and the processing unit (105), when the capturing unit (101) and the processing unit (105) are unpaired.
8. The method as claimed in claim 6, further comprises disabling at least one of the capturing unit (101) and the processing unit (105), when the real-time location of at least one of the capturing unit (101) and the processing unit (105) is different from the location used for the geo-tagging.
9. The method as claimed in claim 6, wherein monitoring of the display (103) on the hoarding (102) is performed for a pre-determined duration of time in each of the predefined number of days.
10. The method as claimed in claim 6, further comprises identifying, based on the comparison, at least one of:
presence of damage in the display (103) on the hoarding (102); and
the real-time image (206) of the display (103) on the hoarding (102) being different from the one or more pre-stored images (207).
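For illustration only (not part of the claims or the specification as filed), the monitoring flow recited in claims 1 to 5 can be sketched in code. The MAC addresses, geo-tag coordinates, similarity measure, and the 0.90 match threshold below are all hypothetical assumptions introduced for this sketch; the claims fix none of these values, and a real processing unit (105) would use camera frames and a more robust image-comparison method.

```python
# Hypothetical sketch of the claimed monitoring logic, assuming:
# toy grayscale rasters as images, exact-pixel similarity, and an
# arbitrary 0.90 match threshold (none of these are claimed values).
from dataclasses import dataclass
from datetime import datetime

MATCH_THRESHOLD = 0.90  # assumed cut-off for "same display"; not from the claims


@dataclass
class TimestampedImage:
    """A real-time image (206) captured with a timestamp, per claim 1."""
    pixels: list  # toy 2-D list standing in for a camera frame
    timestamp: datetime


def similarity(a, b):
    """Fraction of matching pixels between two equally sized rasters."""
    flat_a = [p for row in a for p in row]
    flat_b = [p for row in b for p in row]
    same = sum(1 for x, y in zip(flat_a, flat_b) if x == y)
    return same / len(flat_a)


def monitor(real_time, pre_stored, *, paired_mac, expected_mac, gps_fix, geo_tag):
    """Return 'DISABLED', an alert string, or None when the display matches.

    Mirrors claims 2-3 (disable when unpaired or off the geo-tagged
    location) and claims 1 and 5 (compare against pre-stored images (207)
    and alert on mismatch or damage).
    """
    # Claims 2 and 3: operate only when MAC-paired and at the geo-tagged location.
    if paired_mac != expected_mac or gps_fix != geo_tag:
        return "DISABLED"
    # Claim 1: compare the real-time image with one or more pre-stored images.
    best = max(similarity(real_time.pixels, ref) for ref in pre_stored)
    if best < MATCH_THRESHOLD:
        # Claim 5: mismatch may indicate damage or a different display.
        return f"ALERT: display mismatch/damage at {real_time.timestamp.isoformat()}"
    return None
```

In this sketch, the pairing and geo-tagging checks gate all processing, reflecting the claim language that each unit "is configured to operate based on the pairing and the geo-tagging".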
Dated this 23rd day of September, 2020
R. RAMYA RAO
OF K&S PARTNERS
AGENT FOR THE APPLICANT(S)
IN/PA- 1607
| # | Document | Date |
|---|---|---|
| 1 | 201941038257-STATEMENT OF UNDERTAKING (FORM 3) [23-09-2019(online)].pdf | 2019-09-23 |
| 2 | 201941038257-PROVISIONAL SPECIFICATION [23-09-2019(online)].pdf | 2019-09-23 |
| 3 | 201941038257-POWER OF AUTHORITY [23-09-2019(online)].pdf | 2019-09-23 |
| 4 | 201941038257-FORM 1 [23-09-2019(online)].pdf | 2019-09-23 |
| 5 | 201941038257-DRAWINGS [23-09-2019(online)].pdf | 2019-09-23 |
| 6 | 201941038257-DECLARATION OF INVENTORSHIP (FORM 5) [23-09-2019(online)].pdf | 2019-09-23 |
| 7 | 201941038257-FORM 13 [18-09-2020(online)].pdf | 2020-09-18 |
| 8 | 201941038257-DRAWING [23-09-2020(online)].pdf | 2020-09-23 |
| 9 | 201941038257-CORRESPONDENCE-OTHERS [23-09-2020(online)].pdf | 2020-09-23 |
| 10 | 201941038257-COMPLETE SPECIFICATION [23-09-2020(online)].pdf | 2020-09-23 |
| 11 | 201941038257-FORM 18 [05-10-2020(online)].pdf | 2020-10-05 |
| 12 | 201941038257-FER.pdf | 2021-12-02 |
| 13 | 201941038257-FORM 4(ii) [30-05-2022(online)].pdf | 2022-05-30 |
| 14 | 201941038257-FER_SER_REPLY [22-07-2022(online)].pdf | 2022-07-22 |
| 15 | 201941038257-PETITION UNDER RULE 137 [22-07-2022(online)].pdf | 2022-07-22 |
| 16 | 201941038257-US(14)-HearingNotice-(HearingDate-04-03-2024).pdf | 2024-02-17 |
| 17 | 201941038257-FORM-26 [26-02-2024(online)].pdf | 2024-02-26 |
| 18 | 201941038257-Correspondence to notify the Controller [28-02-2024(online)].pdf | 2024-02-28 |
| 19 | 201941038257-Written submissions and relevant documents [18-03-2024(online)].pdf | 2024-03-18 |
| 20 | 201941038257-PatentCertificate26-03-2024.pdf | 2024-03-26 |
| 21 | 201941038257-IntimationOfGrant26-03-2024.pdf | 2024-03-26 |
| 22 | sserE_24-11-2021.pdf | |
| 23 | sseraAE_29-12-2022.pdf | |