Abstract: A system for smart search of objects comprises an electronic device equipped with a camera and a smart search database. The camera, enabled for smart search, captures snapshots of a plurality of objects at periodic time intervals, and the captured snapshots are stored in the database for smart search of the objects. In an embodiment, a method of smart search comprises the steps of capturing snapshots of an object at different orientations and distances with an electronic device equipped with a camera and registering the object in the smart search database. The method further configures time intervals for capturing the snapshots of the objects, and the captured snapshots are stored in the smart search database. Further, the electronic devices are synchronized and the database is updated with the snapshots of the objects together with the timestamp and location of the objects.
SYSTEM AND METHOD FOR SMART SEARCH
FIELD OF INVENTION
[1] This invention relates to a smart search system, and more particularly to a system and method of smart search for electronic devices equipped with a camera.
BACKGROUND
[2] Conventionally, a receiver tag is put on an object that is required to be tracked. These objects are tracked by means of a transmitter which transmits signals to the receiver tag. When the transmitter is switched ON, the receiver tags either glow or make a beep sound in response, allowing the user to locate the lost object. However, the receiver tags suffer from inherent disadvantages. The receiver tags can get damaged or worn out over time, or may slip off the object such that they do not work when needed. The objects themselves may also slide off to inaccessible corners such that, during a search, even if the tags beep or flash in response they might not be noticed. Therefore, there exists a need for a system to address the above-mentioned issues.
SUMMARY
[3] Embodiments of the present disclosure described herein provide a system and method of smart search for electronic devices equipped with a camera.
[4] In an embodiment, a system for smart search of objects comprises an electronic device equipped with a camera and a smart search database. The camera, enabled for smart search of objects, captures snapshots of the plurality of objects at periodic time intervals. The captured snapshots are stored in the database along with the timestamp and location information for smart search of the objects.
[5] In another embodiment, a method of smart search comprises the steps of capturing snapshots of an object at different orientations and distances by an electronic device equipped with a camera and registering the objects in the smart search database. The method of smart search further configures time intervals for capturing the snapshots of the objects and the captured snapshots of the objects are stored in the smart search database. Further, the database is synchronized with the snapshots of the objects with timestamp and location of the objects.
[6] Electronic devices equipped with a camera, such as DTVs, mobile phones and laptops, can be equipped with a "Smart Search" feature which can be used to track lost or misplaced objects, such as keys, remote controllers, eyeglasses and wallets, which are often misplaced inside one's home or workplace.
[7] The smart search feature uses a sophisticated pattern recognition algorithm to recognize the objects. The objects are initially registered in a smart search database. The database stores images of the objects when viewed from different orientations at different distances. The database is commonly shared across all the electronic devices equipped with the smart search feature located within the house or workplace.
[8] When the smart search feature is enabled in the electronic devices equipped with a camera, the devices start capturing snapshots of the surroundings at regular time intervals. The snapshots are then analysed to match and locate the objects that have already been registered in the database. While scanning the snapshots, the smart search feature tries to identify the registered objects in each snapshot. The details of any objects so identified are recorded in the database to update the current location of each object within the premises.
[9] To locate a particular object, a user provides the object details to be found. The smart search database provides the user with the location of the object with the timestamps captured in the snapshots of the object.
BRIEF DESCRIPTION OF FIGURES
[10] In the accompanying figures, similar reference numerals refer to identical or functionally similar elements. These reference numerals are used in the detailed description to illustrate various embodiments and to explain various aspects and advantages of the present disclosure.
[11] FIG. 1 illustrates a system for smart search in accordance with one embodiment.
[12] FIG. 2 illustrates a method of smart search in accordance with another embodiment.
[13] Persons skilled in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and have not been drawn to scale. For example, the dimensions of some of the elements in the figure may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present disclosure.
DETAILED DESCRIPTION
[14] It may be observed that the components of the system and the method steps have been represented by conventional symbols in the figures, showing only the specific details that are relevant for an understanding of the present disclosure. Further, details that may be readily apparent to a person ordinarily skilled in the art may not have been disclosed. In the present disclosure, relational terms may be used to distinguish one entity from another entity, without necessarily implying any actual relationship or order between such entities.
[15] Embodiments in the present disclosure as described herein provide a system and a method of smart search.
[16] FIG. 1 illustrates a system 100 for tracking objects by an electronic device such as a digital television 105. The system 100 includes the digital television 105 and a plurality of objects, for example a wallet 110, a cell phone 115, a remote control 120, a key 125 and an eye glass 130.
[17] In an embodiment, the system 100 shows the digital television 105 as an electronic device equipped with a smart search feature.
[18] The smart search feature uses a pattern recognition algorithm (not shown) to recognize and store the details of the objects that are registered in a database for incorporating the smart search feature. The database stores snapshot images of the objects captured from different orientations and at different distances from the camera-equipped electronic device. When the smart search feature is enabled, the electronic device captures snapshots of its immediate surroundings at timed intervals. The smart search feature then extracts the details from the smart search database to locate any registered object captured in the snapshot images. The snapshots are scanned to match and locate the objects that have already been registered in the database. The details of any objects identified in a snapshot are recorded in the database to update the current location of each object within the premises.
[19] To locate an object, a user provides the details of the object to be searched to the electronic device having the smart search feature. The smart search database then provides the user with the location of the object with the timestamps captured in the snapshots of the desired object.
[20] In a preferred embodiment, the digital television 105 is installed at a particular location. The digital television 105 captures a snapshot of the location and registers the location. An object to be tracked by the digital television 105 is registered with it by holding the object in front of the camera of the digital television 105, taking snapshots of the object at different angles and at different distances, and storing the snapshots in the database. This enables the digital television 105 to recognize the object whenever the object appears in view of its camera.
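The registration step described above can be sketched in code. This is a minimal illustrative sketch, assuming an in-memory database keyed by object name; the class `SmartSearchDB`, the method `register_object` and the snapshot fields are hypothetical names, not part of the disclosure.

```python
# Hypothetical sketch of object registration. Each reference snapshot
# records the orientation and distance at which it was captured, as the
# description suggests; real images would replace the placeholder strings.

class SmartSearchDB:
    """Toy stand-in for the smart search database."""

    def __init__(self):
        self.registered = {}   # object name -> list of reference snapshots
        self.sightings = {}    # object name -> (timestamp, location)

    def register_object(self, name, snapshots):
        # Store the reference snapshots used later by the pattern
        # recognition algorithm to identify the object.
        self.registered[name] = list(snapshots)


db = SmartSearchDB()
db.register_object("wallet", [
    {"orientation": "front", "distance_m": 0.3, "image": "..."},
    {"orientation": "side",  "distance_m": 1.0, "image": "..."},
    {"orientation": "top",   "distance_m": 2.0, "image": "..."},
])
```

In practice each entry would hold image features extracted by the pattern recognition algorithm rather than raw snapshots.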
[21] In one embodiment, the system 100 shows some of the common objects that need to be tracked in daily life, such as the wallet 110, the cell phone 115, the remote control 120, the key 125 and the eye glass 130.
[22] FIG. 2 illustrates a method of smart search for searching an object.
[23] The method 200 shows the steps involved in smart search for tracking an object.
[24] At step 205, the user turns on the smart search feature.
[25] At step 210, the electronic device equipped with a camera registers the location in which the electronic device is placed. For example, if the digital television (DTV) is installed in the living room then there is a need to register only the living room location with the electronic device. However, for more mobile electronic devices, such as the laptop and the cell phone, the user will need to register all the locations within the home or the workplace. The registration needs to be done only for the first time.
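The one-time location registration at step 210 can be sketched as follows. This is an illustrative sketch only: the `Device` class, its `mobile` flag and the method names are hypothetical, not from the disclosure. It captures the distinction drawn above between a fixed device, which registers a single location, and a mobile device, which may register every room in the home or workplace.

```python
# Hypothetical sketch of one-time location registration per device.

class Device:
    def __init__(self, name, mobile=False):
        self.name = name
        self.mobile = mobile
        self.locations = []

    def register_location(self, location):
        # A fixed device (e.g. a DTV) keeps exactly one location; a
        # mobile device (e.g. a laptop) accumulates every registered room.
        if self.mobile:
            self.locations.append(location)
        else:
            self.locations = [location]


dtv = Device("DTV")
dtv.register_location("living room")

laptop = Device("laptop", mobile=True)
for room in ("living room", "study", "bedroom"):
    laptop.register_location(room)
```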
[26] At step 215, the user registers the objects that are to be tracked by the smart search feature by holding the object in front of either the camera of the DTV, the cell phone, the laptop or any other camera equipped electronic device with the smart search feature.
[27] The smart search feature guides the user in registering the object by a guiding means, such as a voice command or an on-screen text display, indicating the orientations and distances at which the object is to be registered. This includes close-up, short-distance and long-distance shots to enable the pattern recognition algorithm to recognize the object.
[28] At step 220, the user sets the time interval in the electronic device to take a snapshot of the surroundings.
[29] At step 225, the user turns on the smart search feature in all the electronic devices in home or workplace for tracking the object of interest.
[30] At step 230, a copy of the database is maintained on all the capable devices. The database is synchronized whenever the devices are connected to each other, such as, but not limited to, by means of Wi-Fi. All the electronic devices share the database.
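The synchronization at step 230 can be sketched as a merge of per-device sighting records. This is a minimal sketch under the assumption that each device keeps a map from object name to its last sighting and that the newest timestamp wins; `merge_databases` is an illustrative name, not part of the disclosure.

```python
# Hypothetical sketch of database synchronization between two devices.

def merge_databases(local, remote):
    """Merge two {object: (timestamp, location)} maps; the newer sighting wins."""
    merged = dict(local)
    for name, (ts, loc) in remote.items():
        if name not in merged or ts > merged[name][0]:
            merged[name] = (ts, loc)
    return merged


dtv_db    = {"wallet": (1000, "living room"), "keys": (900, "living room")}
laptop_db = {"wallet": (1200, "study"),       "remote": (1100, "study")}

synced = merge_databases(dtv_db, laptop_db)
# The laptop's newer wallet sighting replaces the DTV's older record,
# while sightings known to only one device are kept.
```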
[31] At step 235, the electronic device begins capturing snapshots of the immediate surroundings periodically. In the first snapshot the electronic device recognizes the location and makes a note of it. In every subsequent snapshot, the smart search feature searches the database to determine whether any objects registered in the database appear in the snapshot. If the smart search feature finds any such object or objects, the details of the objects identified in the snapshot are recorded in the database to update the current location of each object along with its timestamp.
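The capture-and-match loop of step 235 can be sketched as below. The actual system relies on a pattern recognition algorithm to identify objects in an image; here that is simulated by a simple membership check on labels, so the set `REGISTERED` and the function `scan_snapshot` are illustrative stand-ins only.

```python
# Hypothetical sketch of the periodic scan step: each snapshot is checked
# against the registered objects, and any match updates the object's
# last-seen timestamp and location.

REGISTERED = {"wallet", "keys", "remote"}
sightings = {}  # object name -> (timestamp, location)

def scan_snapshot(objects_in_view, location, timestamp):
    """Record any registered object visible in this snapshot."""
    for obj in objects_in_view:
        if obj in REGISTERED:
            sightings[obj] = (timestamp, location)


# Simulated snapshots taken at the configured interval.
scan_snapshot({"wallet", "cup"}, "living room", 100)
scan_snapshot({"keys"}, "hallway", 160)
scan_snapshot({"wallet"}, "study", 220)
```

Only the most recent sighting per object is retained, which is all the lookup step needs.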
[32] Finally, the user makes sure all the electronic devices are synchronized with each other and with the smart search database. The database is now updated with information from every smart search capable electronic device. To locate an object, the user provides the details of the object to be found. The smart search database then provides the user with the location of the object along with the timestamps captured in the snapshots of the required object. This lets the user know the location and the time at which the object was last seen before it was misplaced, narrowing down the search area and helping the user to easily locate the lost object.
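The lookup described above reduces to a query of the last sighting recorded for the requested object. This is a minimal sketch assuming the same map of last sightings used during scanning; `locate` is an illustrative name, not part of the disclosure.

```python
# Hypothetical sketch of the lookup step: return the last recorded
# timestamp and location for an object, or None if it was never seen.

sightings = {
    "wallet": (220, "study"),
    "keys":   (160, "hallway"),
}

def locate(name):
    """Return the (timestamp, location) of the object's last sighting."""
    return sightings.get(name)


result = locate("wallet")  # the user asks where the wallet was last seen
```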
[33] In the preceding description, the present disclosure and its advantages have been described with reference to specific embodiments. However, it will be apparent to a person of ordinary skill in the art that various modifications and changes can be made, without departing from the scope of the present disclosure, as set forth in the claims below. Accordingly, the specification and figures are to be regarded as illustrative examples of the present disclosure, rather than in restrictive sense. All such possible modifications are intended to be included within the scope of the present disclosure.
I/We claim:
1. A system for smart search of objects, the system comprising:
an electronic device enabled for smart search, equipped with a camera for capturing snapshots of a plurality of objects; and
a smart search database for storing the captured snapshots; wherein
the snapshots are captured at periodic time intervals and stored in the smart search database to trace an object.
2. The system as claimed in claim 1, wherein the plurality of objects is registered in the smart search database employing pattern recognition techniques.
3. The system as claimed in claim 1, wherein the smart search database is accessible by the plurality of electronic devices equipped with a camera.
4. The system as claimed in claim 3, wherein the electronic devices equipped with a camera communicate with each other by means of Wi-Fi or any other suitable means.
5. The system as claimed in claim 4, wherein the plurality of electronic devices equipped with a camera are synchronized and the smart search database is updated at regular intervals.
6. The system as claimed in claim 1, wherein the captured snapshots are time stamped.
7. A method of smart search of objects, the method comprising:
capturing snapshots of an object from different orientations and distances by an electronic device equipped with a camera;
registering a plurality of objects in the smart search database;
configuring time intervals for capturing snapshots of the objects;
storing the captured snapshots of the objects in the database; and
synchronizing the plurality of electronic devices equipped with a camera and updating the database with the snapshots of the objects with timestamp and location of the objects.
8. The method as claimed in claim 7, wherein the electronic devices equipped with a camera are connected with each other by means of Wi-Fi or any other suitable means.