Abstract: A system to monitor and locate an entity in an indoor environment is provided. The system includes at least four ultra-wideband generators and at least one transceiver. The system also includes at least one ultra-wideband tag and a processing subsystem. The processing subsystem includes a distance identification module configured to transmit ultra-wideband waves, receive a reflected plurality of ultra-wideband waves, identify a distance, and identify a location of the entity. The system also includes an image capturing module configured to identify an activity of the entity, a data merging module configured to merge the identified location of the entity with an identified activity of the entity, and a monitoring module configured to monitor the activity of the entity. FIG. 1
Claims:
WE CLAIM:
1. A system (10) to monitor and locate an entity in an indoor environment comprising:
at least one transceiver (40), and at least four ultra-wideband generators (30) operatively coupled to the entity (20), wherein the at least four ultra-wideband generators (30) are configured to generate a plurality of ultra-wideband waves;
at least one ultra-wideband tag (50) operatively coupled to the indoor environment in a predefined pattern, wherein the at least one ultra-wideband tag (50) is configured to:
receive the plurality of ultra-wideband waves from the entity (20);
transmit a received plurality of ultra-wideband waves to the entity (20);
a processing subsystem (60) comprising:
a distance identification module (70) configured to:
transmit the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag (50) by a first set of transceivers of the entity (20);
receive a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity (20);
identify a distance between the entity and the at least one ultra-wideband tag (50);
identify a location of the entity (20) within the indoor environment by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag (50);
an image capturing module (80) coupled to a plurality of image capturing devices, and the distance identification module (70), and configured to identify an activity of the entity in the indoor environment;
a data merging module (90) coupled to the image capturing module (80), and configured to merge the identified location of the entity (20) with an identified activity of the entity (20) by object detection and identification techniques; and
a monitoring module (100) coupled to the data merging module (90), and configured to monitor the activity of the entity (20) in real time.
2. The system (10) as claimed in claim 1, wherein the plurality of image capturing devices is placed in the indoor environment in a predetermined manner.
3. The system (10) as claimed in claim 1, further comprising a notification module operatively coupled to the data merging module (90), and configured to generate a plurality of notifications to guide a user.
4. The system (10) as claimed in claim 3, wherein the plurality of notifications comprises at least one of a text notification, a push notification, and a multimedia notification.
5. A method (280) for monitoring and locating an entity in an indoor environment comprising:
generating, by at least four ultra-wideband generators, a plurality of ultra-wideband waves;
receiving, by at least one ultra-wideband tag, the plurality of ultra-wideband waves from the entity;
transmitting, by the at least one ultra-wideband tag, a received plurality of ultra-wideband waves to the entity;
transmitting, by a distance identification module, the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity;
receiving, by the distance identification module, a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity;
identifying, by the distance identification module, a distance between the entity and the at least one ultra-wideband tag;
identifying, by the distance identification module, a location of the entity by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag;
identifying, by an image capturing module, an activity of the entity in the indoor environment;
merging, by a data merging module, the identified location of the entity with an identified activity of the entity by object detection and identification techniques; and
monitoring, by a monitoring module, the activity of the entity in real time.
6. The method (280) as claimed in claim 5, further comprising placing the plurality of image capturing devices in the indoor environment in a predetermined manner.
7. The method (280) as claimed in claim 5, further comprising generating a plurality of notifications to guide a user.
8. The method (280) as claimed in claim 7, wherein generating the plurality of notifications to guide the user comprises generating at least one of a text notification, a push notification, and a multimedia notification to guide the user.
Dated this 5th day of July 2019
Signature
Vidya Bhaskar Singh Nandiyal
IN/PA-2912
Agent for the Applicant
Description:
FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate to entity monitoring and locating, and more particularly to a system and a method to monitor and locate an entity in an indoor environment.
BACKGROUND
[0002] Typical monitoring and location-finding systems use GPS (Global Positioning System) technology to obtain location details in an outdoor environment. In indoor environments, however, GPS technology is inadequate, as signals from the satellites are typically not accurate enough to monitor and locate a plurality of entities. Monitoring and localisation of a plurality of entities in an indoor environment is an important process for industries, particularly for the large-scale manufacturing industry. The indoor environment may also include a shopping mall, a museum, an amusement park, and the like. An indoor monitoring and locating system can be used to locate the plurality of entities inside buildings, typically via a computing device. Monitoring and locating services keep a user informed of changes in a position of the plurality of entities.
[0003] A conventional system may use a combination of sensors (e.g., a gyroscope sensor and an accelerometer sensor) to monitor an activity of the plurality of entities, but such sensors fail to produce high-accuracy results when there is any drift in the position of the gyroscopes and the accelerometers. Magnetic sensors may provide high-accuracy results when there is no magnetic interference in the indoor environment; if there is any magnetic interference in the indoor environment, the magnetic sensor fails to provide high-accuracy results.
[0004] A newer system may use image capturing devices to monitor and locate the plurality of entities, which leads to a manual process. The manual process makes the system slow and prevents it from running in real time. Due to the manual monitoring and locating process, such a system may also be expensive for the manufacturing industry, because people need to be hired for the manual monitoring and locating work.
[0005] Hence, there is a need for an improved system and method to monitor and locate an entity in an indoor environment to address the aforementioned issues.
BRIEF DESCRIPTION
[0006] In accordance with one embodiment of the disclosure, a system to monitor and locate an entity in an indoor environment is provided. The system includes at least one transceiver, and at least four ultra-wideband generators operatively coupled to the entity. The at least four ultra-wideband generators are configured to generate a plurality of ultra-wideband waves.
[0007] The system also includes at least one ultra-wideband tag operatively coupled to the indoor environment in a predefined pattern. The at least one ultra-wideband tag is configured to receive the plurality of ultra-wideband waves from the entity. The at least one ultra-wideband tag is also configured to transmit a received plurality of ultra-wideband waves to the entity.
[0008] The system also includes a processing subsystem. The processing subsystem includes a distance identification module. The distance identification module is configured to transmit the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity. The distance identification module is also configured to receive a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity. The distance identification module is also configured to identify a distance between the entity and the at least one ultra-wideband tag.
[0009] The distance identification module is also configured to identify a location of the entity by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag.
[0010] The system also includes an image capturing module coupled to a plurality of image capturing devices and the distance identification module. The image capturing module is configured to identify an activity of the entity in the indoor environment. The system also includes a data merging module coupled to the image capturing module. The data merging module is configured to merge the identified location of the entity with an identified activity of the entity by object detection and identification techniques. The system also includes a monitoring module coupled to the data merging module. The monitoring module is configured to monitor the activity of the entity in real time.
[0011] In accordance with another embodiment of the disclosure, a method for monitoring and locating an entity in an indoor environment is provided. The method includes generating a plurality of ultra-wideband waves. The method also includes receiving the plurality of ultra-wideband waves from the entity. The method also includes transmitting a received plurality of ultra-wideband waves to the entity. The method includes transmitting the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity by a distance identification module. The method also includes receiving a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity by the distance identification module. The method also includes identifying a distance between the entity and the at least one ultra-wideband tag by the distance identification module. The method also includes identifying a location of the entity by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag by the distance identification module. The method also includes identifying an activity of the entity in the indoor environment by an image capturing module. The method also includes merging the identified location of the entity with an identified activity of the entity by object detection and identification techniques by a data merging module. The method also includes monitoring the activity of the entity in real time by a monitoring module.
[0012] To further clarify the advantages and features of the present disclosure, a more particular description of the disclosure will follow by reference to specific embodiments thereof, which are illustrated in the appended figures. It is to be appreciated that these figures depict only typical embodiments of the disclosure and are therefore not to be considered limiting in scope. The disclosure will be described and explained with additional specificity and detail with the appended figures.
BRIEF DESCRIPTION OF THE DRAWINGS
The disclosure will be described and explained with additional specificity and detail with the accompanying figures in which:
[0013] FIG. 1 is a block diagram representation of a system to monitor and locate an entity in an indoor environment in accordance with an embodiment of the present disclosure;
[0014] FIG. 2 is a block diagram representation of one embodiment of the system to monitor and locate the entity in the indoor environment of FIG. 1 in accordance with an embodiment of the present disclosure;
[0015] FIG. 3 is a block diagram of a computer or a server of the system of FIG. 1 in accordance with an embodiment of the present disclosure; and
[0016] FIG. 4 is a flow diagram representing steps involved in a method for monitoring and locating an entity in an indoor environment in accordance with an embodiment of the present disclosure.
[0017] Further, those skilled in the art will appreciate that elements in the figures are illustrated for simplicity and may not have necessarily been drawn to scale. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the figures by conventional symbols, and the figures may show only those specific details that are pertinent to understanding the embodiments of the present disclosure so as not to obscure the figures with details that will be readily apparent to those skilled in the art having the benefit of the description herein.
DETAILED DESCRIPTION
[0018] For the purpose of promoting an understanding of the principles of the disclosure, reference will now be made to the embodiment illustrated in the figures and specific language will be used to describe them. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Such alterations and further modifications in the illustrated system, and such further applications of the principles of the disclosure as would normally occur to those skilled in the art are to be construed as being within the scope of the present disclosure.
[0019] The terms "comprise", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such a process or method. Similarly, one or more devices or sub-systems or elements or structures or components preceded by "comprises... a" does not, without more constraints, preclude the existence of other devices, sub-systems, elements, structures, components, additional devices, additional sub-systems, additional elements, additional structures or additional components. Appearances of the phrase "in an embodiment", "in another embodiment" and similar language throughout this specification may, but not necessarily do, all refer to the same embodiment.
[0020] Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the art to which this disclosure belongs. The system, methods, and examples provided herein are only illustrative and not intended to be limiting.
[0021] In the following specification and the claims, reference will be made to a number of terms, which shall be defined to have the following meanings. The singular forms “a”, “an”, and “the” include plural references unless the context clearly dictates otherwise.
[0022] Embodiments of the present disclosure relate to a system to monitor and locate an entity in an indoor environment. The system includes at least one transceiver, and at least four ultra-wideband generators operatively coupled to the entity. The at least four ultra-wideband generators are configured to generate a plurality of ultra-wideband waves. The system also includes at least one ultra-wideband tag operatively coupled to the indoor environment in a predefined pattern. The at least one ultra-wideband tag is configured to receive the plurality of ultra-wideband waves from the entity. The at least one ultra-wideband tag is also configured to transmit a received plurality of ultra-wideband waves to the entity. The system also includes a processing subsystem. The processing subsystem includes a distance identification module. The distance identification module is configured to transmit the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity. The distance identification module is also configured to receive a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity. The distance identification module is also configured to identify a distance between the entity and the at least one ultra-wideband tag. The distance identification module is also configured to identify a location of the entity by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag. The system also includes an image capturing module coupled to a plurality of image capturing devices and the distance identification module. The image capturing module is configured to identify an activity of the entity in the indoor environment. The system also includes a data merging module coupled to the image capturing module. The data merging module is configured to merge the identified location of the entity with an identified activity of the entity by object detection and identification techniques. The system also includes a monitoring module coupled to the data merging module. The monitoring module is configured to monitor the activity of the entity in real time.
[0023] FIG. 1 is a block diagram representation of a system (10) to monitor and locate an entity in an indoor environment in accordance with an embodiment of the present disclosure. The system (10) includes at least one transceiver (40), and at least four ultra-wideband generators (30) operatively coupled to the entity (20). The at least four ultra-wideband generators (30) are configured to generate a plurality of ultra-wideband waves. In one embodiment, the at least four ultra-wideband generators (30) may be ultra-wideband anchors which are configured to generate the plurality of ultra-wideband waves. In one embodiment, the entity (20) may be one of a movable object and a portable object which is manufactured by an industry in the indoor environment.
[0024] As used herein, the term ‘transceiver’ is defined as a device that can both transmit and receive communications, in particular a combined radio transmitter and receiver. In one embodiment, a working frequency of the transceiver may be 6.65 GHz. In another embodiment, the entity (20) may include one of a forklift, a manual cart, a motorised cart, a tow-tugger, a tow-truck or a fork truck which may be used in managing logistics or a cargo delivery. In yet another embodiment, the entity may be an autonomous vehicle. In yet another embodiment, the entity may include a user (a human being) within the indoor location.
[0025] The system (10) also includes at least one ultra-wideband tag (50) operatively coupled to the indoor environment in a predefined pattern. The at least one ultra-wideband tag (50) is configured to receive the plurality of ultra-wideband waves from the entity (20). In one embodiment, the at least one ultra-wideband tag (50) may receive the plurality of ultra-wideband waves from a plurality of entities (20). The at least one ultra-wideband tag (50) is also configured to transmit a received plurality of ultra-wideband waves to the entity (20). The number of the at least one ultra-wideband tag (50) may vary based on the size of the indoor environment. In one embodiment, the indoor environment may include, but is not limited to, a shopping mall, a museum, an amusement park, a logistics area, and the like. As used herein, the term ‘ultra-wideband’ is defined as a radio technology that can use a very low energy level for short-range, high-bandwidth communications over a large portion of the radio spectrum. In one embodiment, the at least one ultra-wideband tag (50) may receive the plurality of ultra-wideband waves at a time.
[0026] The processing subsystem (60) includes a distance identification module (70). The distance identification module (70) is configured to transmit the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag (50) by a first set of transceivers of the entity (20). The transmitting of the ultra-wideband waves is performed by a first set of the at least one transceiver (40). The distance identification module (70) is also configured to receive a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity (20); the at least one ultra-wideband tag (50) transmits the received waves back towards the place where the entity (20) may be previously located, and the reflected plurality of ultra-wideband waves may be received by the second set of transceivers of the entity (20).
[0027] Furthermore, the distance identification module (70) is also configured to identify a distance between the entity and the at least one ultra-wideband tag (50). In one embodiment, the identification of the distance between the entity and the at least one ultra-wideband tag (50) may include calculating the distance between the entity and the at least one ultra-wideband tag (50) based on the speed of transmission and reception of the ultra-wideband waves and the time taken by the ultra-wideband waves to travel between the entity and the at least one ultra-wideband tag (50).
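For illustration only, the sketch below shows a minimal version of this distance calculation, assuming a simple two-way (round-trip) time-of-flight model in which the tag reflects the wave back to the entity; the function and variable names are illustrative and are not taken from the specification.

```python
# Minimal sketch of distance identification from a round-trip time, assuming
# the reflected-wave model described above. Names are illustrative only.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Return the one-way distance (metres) for a reflected UWB pulse."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of about 66.7 nanoseconds corresponds to roughly 10 m.
print(distance_from_round_trip(66.7e-9))
```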
[0028] The distance identification module (70) is also configured to identify a location of the entity (20) by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag (50). In one embodiment, at least one ultra-wideband tag (50) and at least three transceivers are required to find the location of the entity (20). In one embodiment, the predetermined distance is calculated and pre-stored in a database. In one such embodiment, the database may be stored on a local server. In another such embodiment, the database may be stored on a remote server such as a cloud server.
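The specification does not name a particular positioning algorithm, but the requirement of at least three transceivers is consistent with standard multi-range positioning. The sketch below shows one common approach, a linearised least-squares trilateration in two dimensions; the anchor coordinates and measured ranges are hypothetical values chosen only to make the example concrete.

```python
# Minimal 2-D trilateration sketch: given known reference positions and the
# distances identified above, solve the linearised range equations by least
# squares. The anchor layout and ranges below are illustrative assumptions.
import numpy as np

def locate_entity(anchors: np.ndarray, distances: np.ndarray) -> np.ndarray:
    """Estimate (x, y) of the entity from >= 3 reference points and ranges."""
    x0, y0 = anchors[0]
    d0 = distances[0]
    # Subtract the first range equation from the rest to remove quadratic terms.
    a = 2 * (anchors[1:] - anchors[0])
    b = (d0**2 - distances[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - (x0**2 + y0**2))
    position, *_ = np.linalg.lstsq(a, b, rcond=None)
    return position

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
distances = np.array([5.0, 8.06, 6.71])  # approximate ranges to a point near (3, 4)
print(locate_entity(anchors, distances))  # roughly [3.0, 4.0]
```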
[0029] Furthermore, the system (10) also includes an image capturing module (80) coupled to a plurality of image capturing devices and the distance identification module (70). The image capturing module (80) is configured to identify an activity of the entity in the indoor environment. In one embodiment, the image capturing devices may include, but are not limited to, a camera. In such an embodiment, the camera is configured to capture at least one of one or more images and one or more videos. In one embodiment, the activity of the entity may include a movement of the entity (20) such as walking, running, jogging, and the like in case of the entity being a human. In another embodiment, the movement of the entity may include a speed of the object, a direction of movement of the object, a path of movement of the object, and the like. In one embodiment, the plurality of image capturing devices is placed in the indoor environment in a predetermined manner.
[0030] Furthermore, the system (10) also includes a data merging module (90) coupled to the image capturing module (80). The data merging module (90) is configured to merge the identified location of the entity with an identified activity of the entity by object detection and identification techniques. As used herein, the term ‘object detection and identification’ is defined as a computer technology related to computer vision and image processing that deals with detecting and identifying instances of semantic objects of a certain class (such as humans, buildings, or cars) in digital images and videos. The system (10) also includes a monitoring module (100) coupled to the data merging module (90). The monitoring module (100) is configured to monitor the activity of the entity (20) in real time.
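As a rough illustration of the data merging step, the sketch below pairs a UWB-derived location fix with a camera-derived activity observation when they share an entity identifier and a close timestamp. The record layout, field names, and time-skew tolerance are assumptions made for illustration; the specification does not define a data format for this step.

```python
# Merge a location fix with an activity observation for the same entity at
# approximately the same time. Layout and the 0.5 s tolerance are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LocationFix:
    entity_id: str
    timestamp: float  # seconds
    x: float          # metres
    y: float          # metres

@dataclass
class ActivityObservation:
    entity_id: str
    timestamp: float
    activity: str     # e.g. label produced by an object detection / identification step

def merge(fix: LocationFix, obs: ActivityObservation,
          max_skew_s: float = 0.5) -> Optional[dict]:
    """Return a merged record, or None if the inputs cannot be matched."""
    if fix.entity_id != obs.entity_id or abs(fix.timestamp - obs.timestamp) > max_skew_s:
        return None
    return {"entity_id": fix.entity_id, "timestamp": fix.timestamp,
            "position": (fix.x, fix.y), "activity": obs.activity}
```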
[0031] Furthermore, the system (10) also includes a notification module operatively coupled to the data merging module (90). The notification module is configured to generate a plurality of notifications to guide the entity (20). In one embodiment, the plurality of notifications includes at least one of a text notification, a push notification, and a multimedia notification. In such an embodiment, the user may receive the notification via a computing device. In such an embodiment, the computing device may be a hand-held device. In one embodiment, the computing device may include, but is not limited to, a laptop, a desktop, a notebook, a tablet, a smartphone, and the like. In another such embodiment, the computing device may be a portable device. In one embodiment, the user needs to register himself in the system (10) by providing user details using the computing device. In such an embodiment, the user details may include, but are not limited to, a name of the user, an e-mail of the user, a phone number of the user, a nationality of the user, and the like.
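A minimal sketch of such a notification module is given below, assuming the three notification kinds mentioned above (text, push, multimedia) and a stubbed delivery call; a real deployment would substitute an SMS gateway, a push service, or a media-message API for the print statement, and the contact string is hypothetical.

```python
# Dispatch a notification of one of the three kinds named in the embodiment.
# The delivery call is a stub; names and the contact value are illustrative.
from enum import Enum

class NotificationKind(Enum):
    TEXT = "text"
    PUSH = "push"
    MULTIMEDIA = "multimedia"

def notify(user_contact: str, kind: NotificationKind, payload: str) -> None:
    """Send a notification of the given kind to guide the user."""
    # Stub: a production system would call the appropriate messaging back-end here.
    print(f"[{kind.value}] to {user_contact}: {payload}")

notify("+91-0000000000", NotificationKind.PUSH, "Entity 42 has entered aisle B3")
```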
[0032] In one exemplary embodiment, the system (10) includes the at least four ultra-wideband generators (30) to set up a network to monitor the entity (20) in the indoor environment. The system (10) also includes at least three ultra-wideband generators (30) to locate and monitor the entity (20) in the indoor environment.
[0033] FIG. 2 is a block diagram representation of one embodiment of the system (110) to monitor and locate the entity in the indoor environment of FIG. 1 in accordance with an embodiment of the present disclosure. The system (110) provides a platform in which computer vision and location information are integrated for monitoring and locating the entity inside a mall. A user ‘X’ (120) registers himself with the platform by providing details of the user ‘X’ (120) through a smartphone (130). Further, as the user ‘X’ (120) enters the mall, locating and monitoring of the user ‘X’ (120) within the mall is performed in real time.
[0034] A distance identification module (190) transmits a plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag (170) by a first set of transceivers of the entity (140), wherein the plurality of ultra-wideband waves is generated by ultra-wideband wave generators operatively coupled to the indoor environment of the mall in every corner of the mall. The distance identification module (190) receives a reflected plurality of ultra-wideband waves by a second set of transceivers operatively coupled to the entity (140). The distance identification module (190) identifies a distance between the entity and the at least one ultra-wideband tag (170). The distance identification module (190) identifies a location of the entity (140) within the mall by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag (170).
[0035] The system (110) also includes an image capturing module (200) coupled to a plurality of cameras and the distance identification module (190), and configured to identify a forward movement and a reverse movement of the entity (140) in the mall. The system (110) also includes a data merging module (210) which is configured to merge the identified location of the entity with an identified activity of the entity by implementing an object detection and identification technique. The system (110) also includes a monitoring module (220) which is configured to monitor the activity of the entity in real time. The system (110) also includes a notification module (230) which is configured to generate a multimedia notification to guide the user ‘X’ (120).
[0036] The entity (140), the at least four ultra-wideband generators (150), the transceiver (160), the at least one ultra-wideband tag (170), the processing subsystem (180), the distance identification module (190), the image capturing module (200), the data merging module (210), the monitoring module (220), and the notification module (230) of FIG. 2 are substantially similar to the entity (20), the at least four ultra-wideband generators (30), the transceiver (40), the at least one ultra-wideband tag (50), the processing subsystem (60), the distance identification module (70), the image capturing module (80), the data merging module (90), and the monitoring module (100) of FIG. 1.
[0037] FIG. 3 is a block diagram of a computer or a server of the system (240) of FIG. 1 in accordance with an embodiment of the present disclosure. The system (240) includes a processor(s) (270), and a memory (250) coupled to the processor(s) (270). The processor(s) (270), as used herein, means any type of computational circuit, such as, but not limited to, a microprocessor, a microcontroller, a complex instruction set computing microprocessor, a reduced instruction set computing microprocessor, a very long instruction word microprocessor, an explicitly parallel instruction computing microprocessor, a digital signal processor, or any other type of processing circuit, or a combination thereof.
[0038] Computer memory elements may include any suitable memory device(s) for storing data and executable program, such as read only memory, random access memory, erasable programmable read only memory, electrically erasable programmable read only memory, hard drive, removable media drive for handling memory cards and the like. Embodiments of the present subject matter may be implemented in conjunction with program modules, including functions, procedures, data structures, and application programs, for performing tasks, or defining abstract data types or low-level hardware contexts. Executable program stored on any of the above-mentioned storage media may be executable by the processor(s).
[0039] The memory (250) includes a plurality of modules stored in the form of an executable program which instructs the processor(s) (270) to perform the designated steps. The memory (250) has the following modules: the entity (20), the at least four ultra-wideband generators (30), the at least one transceiver (40), the at least one ultra-wideband tag (50), and the processing subsystem (60).
[0040] The at least one transceiver (40) and the at least four ultra-wideband generators (30) are operatively coupled to the entity (20). The at least four ultra-wideband generators (30) are configured to generate a plurality of ultra-wideband waves. The at least one ultra-wideband tag (50) is operatively coupled to the indoor environment in a predefined pattern. The at least one ultra-wideband tag (50) is configured to receive the plurality of ultra-wideband waves from the entity. The at least one ultra-wideband tag (50) is also configured to transmit a received plurality of ultra-wideband waves to the entity (20). The processing subsystem (60) includes a distance identification module (70) configured to transmit the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag (50) by a first set of transceivers of the entity (20). The distance identification module (70) is also configured to receive a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity (20). The distance identification module (70) is also configured to identify a distance between the entity and the at least one ultra-wideband tag (50). The distance identification module (70) is also configured to identify a location of the entity (20) by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag (50). The system (10) also includes an image capturing module (80) coupled to a plurality of image capturing devices and the distance identification module (70), and configured to identify an activity of the entity in the indoor environment. The system (10) also includes a data merging module (90) coupled to the image capturing module (80), and configured to merge the identified location of the entity (20) with an identified activity of the entity (20) by object detection and identification techniques. The system (10) also includes a monitoring module (100) coupled to the data merging module (90), and configured to monitor the activity of the entity (20) in real time.
[0041] FIG. 4 is a flow diagram representing steps involved in a method (280) for monitoring and locating an entity in an indoor environment in accordance with an embodiment of the present disclosure.
[0042] The method (280) includes generating a plurality of ultra-wideband waves in step 290. In one embodiment, generating the plurality of ultra-wideband waves includes generating the plurality of ultra-wideband waves by at least four ultra-wideband generators. In one embodiment, the entity may be one of a movable object and a portable object which is manufactured by an industry in the indoor environment. In another embodiment, the entity may include one of a forklift, a manual cart, a motorised cart, a tow-tugger, a tow-truck or a fork truck which may be used in managing logistics or a cargo delivery. In yet another embodiment, the entity may be an autonomous vehicle. In yet another embodiment, the entity may include a user (a human being) within the indoor location.
[0043] The method (280) includes receiving the plurality of ultra-wideband waves from the entity in step 300. In one embodiment, receiving the plurality of ultra-wideband waves includes receiving the plurality of ultra-wideband waves from the entity by at least one ultra-wideband tag.
[0044] The method (280) also includes transmitting a received plurality of ultra-wideband waves to the entity in step 310. In one embodiment, transmitting the received plurality of ultra-wideband waves includes transmitting the received plurality of ultra-wideband waves to the entity by the at least one ultra-wideband tag. The number of the at least one ultra-wideband tag may vary based on the size of the indoor environment. In one embodiment, the indoor environment may include, but is not limited to, a shopping mall, a museum, an amusement park, a logistics area, and the like. As used herein, the term ‘ultra-wideband’ is defined as a radio technology that can use a very low energy level for short-range, high-bandwidth communications over a large portion of the radio spectrum.
[0045] The method (280) includes transmitting the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity in step 320. In one embodiment, transmitting the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag includes transmitting the plurality of ultra-wideband waves to the corresponding at least one ultra-wideband tag by a first set of transceivers of the entity by a distance identification module.
[0046] The method (280) also includes receiving a reflected plurality of ultra-wideband waves by a second set of transceivers of the entity in step 330. In one embodiment, receiving the reflected plurality of ultra-wideband waves includes receiving the reflected plurality of ultra-wideband waves by the second set of transceivers of the entity by the distance identification module. In one embodiment, the at least one ultra-wideband tag transmits the reflected plurality of ultra-wideband waves back towards the place where the entity may be previously located, and the plurality of ultra-wideband waves may be received by the second set of transceivers of the entity.
[0047] The method (280) also includes identifying a distance between the entity and the at least one ultra-wideband tag in step 340. In one embodiment, identifying the distance includes identifying the distance between the entity and the at least one ultra-wideband tag by the distance identification module. In one embodiment, the identification of the distance between the entity and the at least one ultra-wideband tag may include calculating the distance between the entity and the at least one ultra-wideband tag based on the speed of transmission and reception of the ultra-wideband waves and the time taken by the ultra-wideband waves to travel between the entity and the at least one ultra-wideband tag.
[0048] The method (280) also includes identifying a location of the entity by comparing an identified distance with data representative of a predetermined distance and a corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag in step 350. In one embodiment, identifying the location of the entity includes identifying the location of the entity by comparing the identified distance with the data representative of the predetermined distance and the corresponding time required to receive the ultra-wideband waves for each of the at least one ultra-wideband tag by the distance identification module. In one embodiment, at least one ultra-wideband tag (50) and at least three transceivers are required to find the location of the entity (20). In one embodiment, the predetermined distance is calculated and pre-stored in a database. In one such embodiment, the database may be stored on a local server. In another such embodiment, the database may be stored on a remote server such as a cloud server.
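Read literally, this step compares the identified distances against pre-stored reference distances for each tag and selects the best match. The sketch below shows one way such a comparison could work, assuming the pre-stored database is a simple mapping from calibrated positions to predetermined tag distances; the table contents, tag identifiers, and position names are illustrative assumptions, and a deployment might hold this table on a local or cloud-hosted server as described above.

```python
# Compare identified tag distances with a pre-stored table of predetermined
# distances recorded for known positions and return the closest match.
# Table contents, tag identifiers, and position names are illustrative.
import math
from typing import Dict

REFERENCE_DISTANCES: Dict[str, Dict[str, float]] = {
    "aisle-A": {"tag-01": 5.0, "tag-02": 8.1, "tag-03": 6.7},
    "aisle-B": {"tag-01": 9.4, "tag-02": 3.2, "tag-03": 7.5},
}

def identify_location(measured: Dict[str, float]) -> str:
    """Return the calibrated position whose predetermined distances best match."""
    def error(position: str) -> float:
        refs = REFERENCE_DISTANCES[position]
        return math.sqrt(sum((measured[tag] - ref) ** 2 for tag, ref in refs.items()))
    return min(REFERENCE_DISTANCES, key=error)

print(identify_location({"tag-01": 5.1, "tag-02": 8.0, "tag-03": 6.8}))  # "aisle-A"
```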
[0049] The method (280) also includes identifying an activity of the entity in the indoor environment in step 360. In one embodiment, identifying the activity of the entity includes identifying the activity of the entity in the indoor environment by an image capturing module. In one embodiment, the image capturing devices may include, but are not limited to, a camera. In such an embodiment, the camera is configured to capture at least one of one or more images and one or more videos. In one embodiment, the activity of the entity may include a movement of the entity (20) such as walking, running, jogging, and the like in case of the entity being a human. In another embodiment, the movement of the entity may include a speed of the object, a direction of movement of the object, a path of movement of the object, and the like. In one embodiment, the plurality of image capturing devices is placed in the indoor environment in a predetermined manner.
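The movement attributes described here (speed, direction, and path) can be derived from consecutive observations of the entity's position. The sketch below computes speed and heading from two timestamped positions and assigns a coarse activity label; the thresholds and labels are illustrative assumptions rather than values from the specification.

```python
# Derive speed, heading, and a coarse activity label from two timestamped
# positions of the entity. Thresholds and labels are illustrative assumptions.
import math
from typing import Dict, Tuple

Observation = Tuple[float, float, float]  # (timestamp_s, x_m, y_m)

def movement(prev: Observation, curr: Observation) -> Dict[str, object]:
    (t0, x0, y0), (t1, x1, y1) = prev, curr
    dt = t1 - t0
    dx, dy = x1 - x0, y1 - y0
    speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
    heading_deg = math.degrees(math.atan2(dy, dx))
    label = "stationary" if speed < 0.2 else ("walking" if speed < 2.0 else "running")
    return {"speed_m_per_s": speed, "heading_deg": heading_deg, "activity": label}

# Example: 1.2 m covered in 1 s along the x-axis -> ~1.2 m/s, heading 0 deg, "walking".
print(movement((0.0, 1.0, 1.0), (1.0, 2.2, 1.0)))
```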
[0050] The method (280) also includes merging the identified location of the entity with an identified activity of the entity by object detection and identification techniques in step 370. In one embodiment, merging the identified location of the entity includes merging the identified location of the entity with the identified activity of the entity by object detection and identification techniques by a data merging module. The method (280) also includes monitoring the activity of the entity in real time in step 380. In one embodiment, monitoring the activity of the entity includes monitoring the activity of the entity in real time by a monitoring module.
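To show how the merging and monitoring steps fit together in real time, the sketch below runs a simple polling loop in which each module is represented by a callable passed in by the caller; all function names and the polling interval are hypothetical placeholders, not APIs defined by the specification.

```python
# Bounded polling loop wiring the monitoring steps together. Each callable
# stands in for one module of the processing subsystem; names are illustrative.
import time
from typing import Callable, Dict

def monitor(entity_id: str,
            read_ranges: Callable[[str], Dict[str, float]],   # distance identification
            locate: Callable[[Dict[str, float]], object],     # location identification
            detect_activity: Callable[[str], str],            # image capturing module
            on_record: Callable[[dict], None],                # data merging + monitoring
            poll_interval_s: float = 0.5,
            cycles: int = 10) -> None:
    """Poll the modules a fixed number of times and hand each merged record on."""
    for _ in range(cycles):
        ranges = read_ranges(entity_id)
        position = locate(ranges)
        activity = detect_activity(entity_id)
        on_record({"entity": entity_id, "position": position,
                   "activity": activity, "timestamp": time.time()})
        time.sleep(poll_interval_s)
```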
[0051] In some embodiments, the method (280) also includes placing the plurality of image capturing devices in the indoor environment in a predetermined manner. In some embodiments, the method (280) also includes generating a plurality of notifications to guide a user. In one embodiment, the plurality of notifications may include, but is not limited to, a text notification, a push notification, and a multimedia notification. In such an embodiment, the user may receive the notification via a computing device. In such an embodiment, the computing device may be a hand-held device. In one embodiment, the computing device may include, but is not limited to, a laptop, a desktop, a notebook, a tablet, a smartphone, and the like. In another such embodiment, the computing device may be a portable device.
[0052] In an exemplary embodiment, the method (280) uses the at least four ultra-wideband generators to set up a network to monitor the entity in the indoor environment. The method (280) also uses at least three ultra-wideband generators to locate and monitor the entity in the indoor environment.
[0053] Various embodiments of the present disclosure enable the system to increase the accuracy of the identified distance by comparing the identified distance with the predetermined distance. The present disclosure increases the speed of the system by having the image capturing module identify the activity of the entity automatically. In addition, the present disclosure merges the identified location of the entity (sensor data) with an identified activity of the entity (computer vision data) by object detection and identification techniques to obtain better accuracy.
[0054] While specific language has been used to describe the disclosure, any limitations arising on account of the same are not intended. As would be apparent to a person skilled in the art, various working modifications may be made to the method in order to implement the inventive concept as taught herein.
[0055] The figures and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, the order of processes described herein may be changed and is not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples.
| # | Name | Date |
|---|---|---|
| 1 | 201941027083-FER.pdf | 2019-09-23 |
| 2 | 201941027083-STATEMENT OF UNDERTAKING (FORM 3) [05-07-2019(online)].pdf | 2019-07-05 |
| 3 | 201941027083-PROOF OF RIGHT [05-07-2019(online)].pdf | 2019-07-05 |
| 4 | 201941027083-FORM 18A [02-08-2019(online)].pdf | 2019-08-02 |
| 5 | 201941027083-POWER OF AUTHORITY [05-07-2019(online)].pdf | 2019-07-05 |
| 6 | 201941027083-FORM-9 [02-08-2019(online)].pdf | 2019-08-02 |
| 7 | Correspondence by Agent_Form -1, Form-3, Form- 5, Form-28, DIPP Certificate And POA__11-07-2019.pdf | 2019-07-11 |
| 8 | 201941027083-FORM FOR STARTUP [05-07-2019(online)].pdf | 2019-07-05 |
| 9 | 201941027083-FORM FOR SMALL ENTITY(FORM-28) [05-07-2019(online)].pdf | 2019-07-05 |
| 10 | 201941027083-COMPLETE SPECIFICATION [05-07-2019(online)].pdf | 2019-07-05 |
| 11 | 201941027083-FORM 1 [05-07-2019(online)].pdf | 2019-07-05 |
| 12 | 201941027083-DECLARATION OF INVENTORSHIP (FORM 5) [05-07-2019(online)].pdf | 2019-07-05 |
| 13 | 201941027083-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [05-07-2019(online)].pdf | 2019-07-05 |
| 14 | 201941027083-DRAWINGS [05-07-2019(online)].pdf | 2019-07-05 |
| 15 | 201941027083-EVIDENCE FOR REGISTRATION UNDER SSI [05-07-2019(online)].pdf | 2019-07-05 |
| 16 | 201941027083ss_19-09-2019.pdf | |