Abstract: IOT AND MACHINE LEARNING BASED SMART DEVICE WITH PARENTAL CONTROL FOR CHILD SAFETY AND MONITORING. A smart device (100) with parental control for child safety and monitoring is disclosed. The smart device (100) comprises a camera (102) adapted to capture images and an ultrasonic sensor (106) arranged in a wearable device worn by a user. The ultrasonic sensor (106) emits ultrasonic waves and measures the bounce-back time of these waves. A processing unit (108) connected to the camera (102) is configured to receive the captured images, detect harmful objects using a machine learning model, generate bounding boxes around the detected harmful objects, and calculate the distance to these objects based on the ultrasonic sensor's measurements. When the calculated distance is less than a programmed threshold, the processing unit (108) performs corrective measures such as actuating a buzzer (112) or transmitting notifications to a user device (114). The smart device (100) thus ensures child safety by identifying and responding to potential hazards. Claims: 10, Figures: 4
Description: BACKGROUND
Field of Invention
[001] Embodiments of the present invention generally relate to a baby safety device and particularly to an internet of things and machine learning based smart device with parental control for child safety and monitoring.
Description of Related Art
[002] Parents today face significant challenges in managing the activities of their young children, aged between 6 months and 5 years. These children often engage in potentially dangerous activities, such as wandering around the house, touching electronic devices, entering the kitchen, and attempting to leave the house, especially when their parents are occupied with household chores. These behaviors can lead to injuries and create substantial concern for parents, who constantly seek ways to ensure their children's safety while managing their household responsibilities.
[003] Numerous existing solutions address aspects of child monitoring and safety. Some of the notable examples include wristbands that are designed to enhance child protection by monitoring the child's location within specific zones. The wristbands further provide real-time tracking and alert parents if the child moves out of designated safe areas. Further, other smart wristbands for children feature sensors integrated into a microcontroller for continuously gathering data from these sensors. Conventional systems and trackers are known to employ regression techniques using sensor data to predict the child's status and potential risks.
[004] These existing solutions provide various features aimed at improving child safety through location tracking, sensor integration, and real-time alerts. However, there remains a need for a comprehensive and user-friendly device that integrates these features into a single, wearable form factor, offering a seamless and effective solution for parents to manage and monitor their young children's activities efficiently.
[005] There is thus a need for a smart device with parental control that addresses the aforementioned limitations in a more efficient manner.
SUMMARY
[006] Embodiments in accordance with the present invention provide a smart device with parental control for child safety and monitoring. The smart device comprises a camera adapted to capture images. The smart device further comprises an ultrasonic sensor arranged in a wearable device worn by a user, wherein the ultrasonic sensor emits ultrasonic waves and measures a bounce-back time of the emitted ultrasonic waves. The smart device further comprises a processing unit connected to the camera. The processing unit is configured to: receive the captured images from the camera; detect a harmful object in the captured images using a machine learning model; generate bounding boxes around the detected harmful object in the captured images; calculate a distance to the detected harmful object based on the bounce-back time measured by the ultrasonic sensor; and perform corrective measures when the calculated distance to the detected harmful object is less than a programmed threshold.
[007] Embodiments in accordance with the present invention further provide a method for child safety and monitoring using a smart device, comprising the steps of: receiving captured images from a camera; detecting a harmful object in the captured images using a convolutional neural network (CNN) model; generating bounding boxes around the detected harmful object in the captured images; calculating a distance to the detected harmful object based on a bounce-back time measured by an ultrasonic sensor; and performing corrective measures when the calculated distance to the detected harmful object is less than a programmed threshold.
[008] Embodiments of the present invention may provide a number of advantages depending on their particular configuration. First, embodiments of the present application may provide a smart device with an integrated safety solution. This integration enhances the overall effectiveness and usability for parents, ensuring that children are monitored continuously and accurately, reducing the likelihood of injury from hazardous objects or environments.
[009] Next, embodiments of the present application may provide a smart device that can be easily integrated into kids' rooms, existing gear, cribs, and manual rockers designated for babies, toddlers, and young kids. This adaptability ensures that the smart device can be utilized in various settings, making it a versatile tool for child safety.
[0010] Next, embodiments of the present application may provide a smart device with a user-friendly experience, ensuring that parents can easily operate and configure the device.
[0011] Next, embodiments of the present application may provide a smart device for real-time monitoring and alerting. By using advanced sensors and machine learning models, the smart device can promptly detect potential hazards and alert parents instantly, enabling swift preventive actions.
[0012] Next, embodiments of the present application may provide a smart device that is capable of continuous learning and adaptation. The machine learning model can be trained and updated over time to improve its accuracy in detecting harmful objects and scenarios to ensure that the smart device remains effective as a child grows and their environment changes.
[0013] Next, embodiments of the present application may provide a smart device that is suitable for both indoor and outdoor use. The robust design and advanced sensor capabilities allow the device to function effectively in various environments by offering comprehensive protection for children regardless of their location.
[0014] Next, embodiments of the present application may provide a smart device that helps in reducing parental stress and anxiety. By offering reliable and continuous monitoring, parents can feel more at ease knowing that their children are safe and protected, allowing them to focus on other tasks and responsibilities with greater peace of mind.
[0015] These and other advantages will be apparent from the present application of the embodiments described herein.
[0016] The preceding is a simplified summary to provide an understanding of some embodiments of the present invention. This summary is neither an extensive nor exhaustive overview of the present invention and its various embodiments. The summary presents selected concepts of the embodiments of the present invention in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other embodiments of the present invention are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The above and still further features and advantages of embodiments of the present invention will become apparent upon consideration of the following detailed description of embodiments thereof, especially when taken in conjunction with the accompanying drawings, and wherein:
[0018] FIG. 1A illustrates a block diagram of a smart device, according to an embodiment of the present invention;
[0019] FIG. 1B illustrates a prototype of the smart device, according to an embodiment of the present invention;
[0020] FIG. 2 illustrates a block diagram of a processing unit of the smart device, according to an embodiment of the present invention; and
[0021] FIG. 3 depicts a flowchart of a method for child safety and monitoring using the smart device, according to an embodiment of the present invention.
[0022] The headings used herein are for organizational purposes only and are not meant to be used to limit the scope of the description or the claims. As used throughout this application, the word "may" is used in a permissive sense (i.e., meaning having the potential to), rather than the mandatory sense (i.e., meaning must). Similarly, the words “include”, “including”, and “includes” mean including but not limited to. To facilitate understanding, like reference numerals have been used, where possible, to designate like elements common to the figures. Optional portions of the figures may be illustrated using dashed or dotted lines, unless the context of usage indicates otherwise.
DETAILED DESCRIPTION
[0023] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the scope of the invention as defined in the claims.
[0024] In any embodiment described herein, the open-ended terms "comprising", "comprises", and the like (which are synonymous with "including", "having", and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of", "consists essentially of", and the like, or the respective closed phrases "consisting of", "consists of", and the like.
[0025] As used herein, the singular forms “a”, “an”, and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0026] FIG. 1A illustrates a block diagram of a smart device 100 designed for child safety and monitoring. The smart device 100 may utilize Internet of Things (IoT) and machine learning-based technologies for real-time monitoring and analysis of an environment of a user. In an embodiment of the present invention, the user may be, but not limited to, a patient, a child, an infant, a toddler, or a common person. Embodiments of the present invention are intended to include or otherwise cover any user of the smart device 100. The smart device 100 may facilitate immediate alerts and updates via a connected mobile application to parents or caretakers to ensure that they are aware of their child's safety status at all times. In an embodiment of the present invention, the smart device 100 may be adapted to continuously monitor the child's surroundings and activities. The smart device 100 may alert parents or caregivers if the user approaches a potentially hazardous object or area. This capability of the smart device 100 may ensure that parents are immediately informed of any potential risks.
[0027] According to embodiments of the present invention, the smart device 100 may comprise a camera 102, a wearable device 104, an ultrasonic sensor 106, a processing unit 108, a communication unit 110, and a buzzer 112.
[0028] The camera 102 may be strategically positioned to capture high-resolution images of the surroundings of a user, providing a clear and comprehensive view of the environment. In an embodiment of the present invention, the camera 102 may be installed at a suitable location. According to embodiments of the present invention, the location for the installation of the camera 102 may be, but is not limited to, a rooftop, a mast, a cabinet, a cradle, a crib, a bed frame, a door, and so forth. Embodiments of the present invention are intended to include or otherwise cover any location for the installation of the camera 102, including known, related art, and/or later developed technologies.
[0029] The camera 102 may also be configured to transmit a live feed of the user’s surroundings to a central monitoring unit (not shown), in an embodiment of the present invention. In an embodiment of the present invention, the central monitoring unit may be configured for continuous monitoring of the live feed of the user’s surroundings. In an embodiment of the present invention, the central monitoring unit may be automated using a computer system. In another embodiment of the present invention, a manual monitoring of the live feed of the user’s surroundings may be done by the parents, the caretakers, or a system administrator.
[0030] According to the other embodiments of the present invention, a resolution for the captured live feed of the user using the camera 102 may be in a range from 320 pixels by 240 pixels to 1920 pixels by 1080 pixels. Embodiments of the present invention are intended to include or otherwise cover any resolution for the live feed of the user captured using the camera 102, including known, related art, and/or later developed technologies. According to the other embodiments of the present invention, the camera 102 may be, but is not limited to, a still camera, a video camera, a colour balancer camera, a thermal camera, an infrared camera, a telephoto camera, a wide-angle camera, a macro camera, a Closed-Circuit Television (CCTV) camera, a web camera, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the camera 102, including known, related art, and/or later developed technologies.
[0031] The wearable device 104 may be designed to be comfortably worn by the user. In a preferred embodiment of the present invention, the wearable device 104 may be an Internet of Things (IoT) enabled device. According to embodiments of the present invention, the wearable device 104 may be, but not limited to, a necklace, a band, a headband, an earring, a ring, a wristband, a handheld controller, a glove, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of the wearable device 104, including known, related art, and/or later developed technologies. The wearable device 104 may be used in a pair of two, in an embodiment of the present invention. The wearable device 104 may be used singularly, in another embodiment of the present invention.
[0032] In an embodiment of the present invention, the wearable device 104 may comprise the ultrasonic sensor 106 and other critical components. The ultrasonic sensor 106 may be integrated into the wearable device 104 to emit ultrasonic waves and measure a bounce-back time of these waves. This measurement allows the smart device 100 to calculate the distance to nearby objects, providing essential data for detecting potential hazards.
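The distance calculation described above follows from the round-trip nature of the bounce-back time: the emitted wave travels to the object and back, so the measured time covers twice the distance. A minimal sketch in Python, assuming a typical speed of sound in air (the constant and function name are illustrative, not taken from the specification):

```python
# Hedged sketch: distance from an ultrasonic bounce-back time.
# The speed of sound (~343 m/s in air at 20 deg C) is an assumption;
# a real device would calibrate for temperature and humidity.

SPEED_OF_SOUND_M_PER_S = 343.0


def distance_from_bounce_back(bounce_back_time_s: float) -> float:
    """Return the one-way distance in metres to the reflecting object.

    The wave travels out and back, so the measured time corresponds
    to twice the distance, hence the division by two.
    """
    if bounce_back_time_s < 0:
        raise ValueError("bounce-back time cannot be negative")
    return SPEED_OF_SOUND_M_PER_S * bounce_back_time_s / 2.0
```

For example, a bounce-back time of 10 ms would correspond to an object roughly 1.7 m away under these assumptions.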
[0033] The processing unit 108 may be connected to both the camera 102 and the ultrasonic sensor 106 of the wearable device 104. The processing unit 108 may further be configured to execute computer-executable instructions to generate an output relating to the smart device 100. In an embodiment of the present invention, the processing unit 108 may be configured to perform several critical functions. The processing unit 108 may receive the captured live feed from the camera 102 and may process images from the captured live feed using a machine learning model to detect harmful objects. The processing unit 108 may generate bounding boxes around the detected harmful objects in the images. The processing unit 108 may further calculate the distance to the detected objects based on the ultrasonic sensor's measurements and determine whether the child is in proximity to a danger zone. When the calculated distance to the detected harmful object is less than a programmed threshold, the processing unit 108 may initiate corrective measures.
[0034] In an embodiment of the present invention, the processing unit 108 may be configured to enable a person to adjust the programmed threshold. In another embodiment of the present invention, the processing unit 108 may autonomously adjust the programmed threshold based on contextual factors such as an age of the user, an environmental condition, and so forth. For example, if the user is a young child, the processing unit 108 may set a lower threshold for detecting harmful objects to provide enhanced protection. Conversely, if the user is an adult, the threshold may be set higher to reduce the likelihood of false alarms. Additionally, the processing unit 108 may consider the environmental conditions such as lighting, weather, or location when adjusting the threshold. For instance, in low-light conditions, the threshold may be lowered to improve object detection accuracy. Similarly, in outdoor environments where there may be more potential hazards, the threshold may be adjusted accordingly to ensure proactive safety measures are in place. By dynamically adjusting the programmed threshold based on contextual factors, the smart device 100 may be effectively adapted to varying situations and provide optimized protection for the user.
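The contextual adjustment described in this paragraph can be sketched as follows. The paragraph reads most consistently as tuning a detection sensitivity threshold (a lower value means more sensitive detection, which matches both "lower threshold for enhanced protection" and "lowered in low light to improve detection accuracy"); that reading, the numeric offsets, and the clamping range are all assumptions for illustration, not values from the specification:

```python
# Hedged sketch: context-dependent adjustment of a detection
# sensitivity threshold. All offsets and bounds are illustrative.

def adjusted_confidence_threshold(base: float,
                                  user_age_years: float,
                                  low_light: bool = False,
                                  outdoors: bool = False) -> float:
    """Return a detection threshold adapted to contextual factors."""
    threshold = base
    if user_age_years < 5:
        threshold -= 0.10   # more sensitive around young children
    else:
        threshold += 0.10   # fewer false alarms for adults
    if low_light:
        threshold -= 0.05   # compensate for reduced image quality
    if outdoors:
        threshold -= 0.05   # more potential hazards outdoors
    # keep the threshold inside a sane operating range
    return min(max(threshold, 0.05), 0.95)
```

With a base of 0.5, a three-year-old indoors in good light would get a more sensitive threshold of about 0.4, while an adult under the same conditions would get about 0.6.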
[0035] In an embodiment of the present invention, the corrective measures may be, but are not limited to, actuating a buzzer 112, transmitting notifications to a user device 114 using the communication unit 110, vibrating the wearable device 104 to notify the user directly, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of corrective measures, including known, related art, and/or later developed technologies.
[0036] According to embodiments of the present invention, the processing unit 108 may be, but not limited to, a Programmable Logic Control (PLC) unit, a micro processing unit, a development board, and so forth. In a preferred embodiment of the present invention, the processing unit 108 may be an Arduino. Embodiments of the present invention are intended to include or otherwise cover any type of the processing unit 108 including known, related art, and/or later developed technologies. In an embodiment of the present invention, the processing unit 108 may further be explained in conjunction with FIG. 2.
[0037] In an embodiment of the present invention, the communication unit 110 may be adapted to transmit the real-time live feed and data of the user to the user device 114 for analyzing the well-being of the user. In a preferred embodiment of the present invention, the communication unit 110 may be an Espressif ESP8266 Wi-Fi module. In another preferred embodiment of the present invention, the communication unit 110 may be Internet of Things (IoT) enabled. Embodiments of the present invention are intended to include or otherwise cover any other communication unit 110, including known, related art, and/or later developed technologies.
[0038] In an embodiment of the present invention, the buzzer 112 may be adapted to alert the user when a harmful object is detected. The buzzer 112 may be designed to produce sound at various volumes and frequencies depending on the severity of the detected threat. The buzzer may be an audio buzzer, a hybrid buzzer, a sound unit, an indicator, and so forth. Embodiments of the present invention are intended to include or otherwise cover any type of buzzer 112, including known, related art, and/or later developed technologies.
[0039] In an embodiment of the present invention, the user device 114 may be connected to the wearable device 104 or the smart device 100 through a wireless communication protocol such as Bluetooth, Wi-Fi, or any other suitable method. The user device 114 may be configured to receive alerts and notifications from the wearable device 104, allowing the parent or the caretaker to monitor the user's environment remotely. Embodiments of the present invention are intended to include or otherwise cover any type of user device 114, including known, related art, and/or later developed technologies.
[0040] FIG. 1B illustrates a prototype of the smart device 100, according to an embodiment of the present invention. In the depicted prototype of the smart device 100, the camera 102 may be arranged to capture the images, and the ultrasonic sensor 106 may be discreetly arranged to measure the bounce-back times of emitted ultrasonic waves.
[0041] FIG. 2 illustrates a block diagram of the processing unit 108 of the smart device 100, according to an embodiment of the present invention. The processing unit 108 may comprise computer-executable instructions in the form of programming modules such as an image receiving module 200, a detection module 202, an analyzing module 204, and an operation module 206.
[0042] In an embodiment of the present invention, the image receiving module 200 may be configured to receive and process the live feed from the camera 102. Upon capturing the live feed, the image receiving module 200 may generate a sensor activation signal, which triggers further processing by other modules within the processing unit 108.
[0043] In an embodiment of the present invention, the detection module 202 may be activated upon receipt of the sensor activation signal from the image receiving module 200. The detection module 202 may actuate the ultrasonic sensor 106 for emitting the ultrasonic waves and measuring the bounce-back time of the emitted ultrasonic waves. Upon measuring the bounce-back time in a definite range, the detection module 202 may generate an analysis signal based on the detection of the objects in the proximity of the user and may transmit the generated analysis signal to the analyzing module 204.
[0044] The analyzing module 204 may be actuated upon receiving the analysis signal from the detection module 202. The analyzing module 204 may utilize a machine learning model to analyze the images from the live feed to detect harmful objects. In an embodiment of the present invention, the machine learning model may be a convolutional neural network (CNN) model. The CNN model may be trained to recognize various harmful objects that could pose a threat to the user. The analyzing module 204 may further generate the bounding boxes around the detected harmful objects in the images. The bounding boxes may be virtual indicators that highlight the location and extent of the harmful objects within the images. The detected object in the bounding box may be labeled with a class by the analyzing module 204, such as "sharp object," "toxic substance," or "potential hazard," which helps in identifying the nature of the threat and determining the appropriate corrective measures to be taken. For instance, if a bottle of bleach is detected in the images, the bounding box would be generated around the bottle, and the object would be labeled as a "toxic substance."
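The labeled-bounding-box output described in this paragraph can be sketched as a small data structure plus a post-processing step. The `Detection` type, the class names, and the raw-output tuple format are illustrative assumptions; the specification does not fix any particular detector API:

```python
# Hedged sketch: turning raw detector outputs into labeled bounding
# boxes, as the analyzing module is described doing. The data format
# (class_id, score, box) is an assumption for illustration.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str          # e.g. "sharp object", "toxic substance"
    confidence: float   # model score in [0, 1]
    box: tuple          # (x_min, y_min, x_max, y_max) in pixels


def label_detections(raw_outputs, class_names, min_confidence=0.5):
    """Keep confident detections and attach their class labels."""
    detections = []
    for class_id, score, box in raw_outputs:
        if score >= min_confidence:
            detections.append(Detection(class_names[class_id], score, box))
    return detections
```

In the bleach-bottle example above, the detector would emit a box around the bottle with the class id of "toxic substance", and `label_detections` would attach that label while discarding low-confidence candidates.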
[0045] The analyzing module 204 may further process the analysis signal to calculate the distance to the detected objects using measurements from the ultrasonic sensor 106. Additionally, the analyzing module 204 may refine the positioning of the bounding boxes around the detected harmful objects to ensure accuracy in identifying danger zones. The analyzing module 204 determines whether the detected objects pose an immediate danger based on their proximity to the user. If the analyzing module 204 determines that the user is within a danger zone, it may transmit a danger alert signal to the operation module 206.
[0046] In response to receiving the danger alert signal, the operation module 206 may initiate the corrective measures to protect the user. These corrective measures may include actuating a buzzer 112, transmitting notifications to a user device 114 via the communication unit 110, or vibrating the wearable device 104 to directly notify the user. The operation module 206 may ensure that appropriate actions are taken promptly to mitigate any potential danger. In an embodiment of the present invention, the operation module 206 may be configured to perform the corrective measures based on a perceived level of danger associated with the detected harmful object. For instance, if the detected object is identified as a sharp object, the operation module 206 may trigger a higher level of alert, such as activating the buzzer at a louder volume or sending urgent notifications to the user device 114. Conversely, if the detected object is identified as a potential hazard with a lower level of threat, the operation module 206 may initiate less intense corrective measures, such as sending a gentle notification to the user device. In an embodiment of the present invention, the operation module 206 may trigger the communication unit 110 for performing the corrective measures by utilizing an Internet of Things (IoT) based technology.
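The severity-graded response described above can be sketched as a mapping from detected class to an escalating set of actions. The severity values and action names are assumptions for illustration; the specification only states that more dangerous classes trigger stronger measures:

```python
# Hedged sketch: severity-graded corrective measures, as the
# operation module is described performing. Severity levels and
# action names are illustrative assumptions.
SEVERITY = {
    "sharp object": 3,
    "toxic substance": 3,
    "potential hazard": 1,
}


def corrective_measures(label: str) -> list:
    """Return the ordered list of actions for a detected class."""
    level = SEVERITY.get(label, 1)   # unknown classes get the base level
    actions = ["notify_user_device"]  # always notify the parent/caretaker
    if level >= 2:
        actions.append("vibrate_wearable")
    if level >= 3:
        actions.append("actuate_buzzer_loud")
    return actions
```

A "sharp object" would thus trigger the full escalation (notification, vibration, loud buzzer), while a low-threat "potential hazard" would produce only a gentle notification.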
[0047] FIG. 3 depicts a flowchart of a method 300 for child safety and monitoring using the smart device 100, according to an embodiment of the present invention.
[0048] At step 302, the smart device 100 may receive the captured images from the camera 102. The images are continuously fed to the processing unit 108 for analysis.
[0049] At step 304, the smart device 100 may detect the harmful object in the captured images using the convolutional neural network (CNN) model. The CNN model is trained to recognize various harmful objects that could pose a threat to the user.
[0050] At step 306, the smart device 100 may generate the bounding boxes around the detected harmful object in the captured images. These bounding boxes visually indicate the presence and location of harmful objects within the images.
[0051] At step 308, the smart device 100 may calculate the distance to the detected harmful objects based on the bounce-back time measured by the ultrasonic sensor 106. If the calculated distance to the detected harmful object is less than the programmed threshold, then the method 300 may proceed to step 310. Otherwise, the method 300 may revert to step 302 to continue monitoring and processing new images.
[0052] At step 310, the smart device 100 may perform the corrective measures to ensure the child's safety.
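The method 300 described in steps 302 through 310 can be sketched as a single loop iteration with the hardware and model stubbed out as callables. The function name, parameters, and the default threshold are assumptions for illustration:

```python
# Hedged sketch of one iteration of method 300: capture (302),
# detect (304-306), measure distance (308), act or loop (310).
# Hardware and model interactions are passed in as callables so the
# control flow can be exercised without a physical device.

def monitoring_step(capture_image, detect_objects, measure_distance,
                    perform_corrective_measures, threshold_m=0.5):
    """Run one iteration of the safety loop.

    Returns True if corrective measures were performed; False means
    no hazard was within the threshold, and the caller loops back to
    capture the next image (step 302).
    """
    image = capture_image()                        # step 302
    for detection in detect_objects(image):        # steps 304-306
        distance_m = measure_distance()            # step 308
        if distance_m < threshold_m:
            perform_corrective_measures(detection)  # step 310
            return True
    return False
```

A host program would call `monitoring_step` in a loop, passing the camera capture, CNN inference, ultrasonic read, and alerting routines as the four callables.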
[0053] While the invention has been described in connection with what is presently considered to be the most practical and various embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims.
[0054] This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined in the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.
CLAIMS
We Claim:
1. A smart device (100) with parental control for child safety and monitoring, comprising:
a camera (102) adapted to capture images;
an ultrasonic sensor (106) arranged in a wearable device worn by a user, wherein the ultrasonic sensor (106) emits ultrasonic waves and measures a bounce-back time of the emitted ultrasonic waves; and
a processing unit (108) connected to the camera (102), characterized in that the processing unit (108) is configured to:
receive the captured images from the camera (102);
detect a harmful object in the captured images using a machine learning model;
generate bounding boxes around the detected harmful object in the captured images;
calculate a distance to the detected harmful object based on the bounce-back time measured by the ultrasonic sensor (106); and
perform corrective measures, when the calculated distance to the detected harmful object is less than a programmed threshold.
2. The smart device (100) as claimed in claim 1, wherein the harmful object is selected from a sharp object, a toxic substance, a potential hazard for children, or a combination thereof.
3. The smart device (100) as claimed in claim 1, wherein the corrective measures are selected from actuating a buzzer (112), transmitting notifications to a user device (114), or a combination thereof.
4. The smart device (100) as claimed in claim 1, wherein the detected object in the bounding box is labeled with a class by the processing unit (108).
5. The smart device (100) as claimed in claim 1, wherein the processing unit (108) is configured to perform the corrective measures based on a perceived level of danger based on the detected harmful object.
6. The smart device (100) as claimed in claim 1, wherein the corrective measures are performed by utilizing an Internet of Things (IoT) based technology.
7. The smart device (100) as claimed in claim 1, wherein the processing unit (108) is configured to adjust the programmed threshold based on contextual factors selected from an age of the user, or an environmental condition.
8. The smart device (100) as claimed in claim 1, wherein the wearable device (104) is an Internet of Things (IoT) enabled device.
9. The smart device (100) as claimed in claim 1, wherein the machine learning model is a convolutional neural network (CNN) model.
10. A method (300) for child safety and monitoring using a smart device (100), comprising steps of:
receiving captured images from a camera (102);
detecting a harmful object in the captured images using a convolutional neural network (CNN) model;
generating bounding boxes around the detected harmful object in the captured images;
calculating a distance to the detected harmful object based on a bounce-back time measured by an ultrasonic sensor (106); and
performing corrective measures, when the calculated distance to the detected harmful object is less than a programmed threshold.
Date: May 31, 2024
Place: Noida
Dr. Keerti Gupta
Agent for the Applicant
(IN/PA-1529)
| # | Name | Date |
|---|---|---|
| 1 | 202441042957-STATEMENT OF UNDERTAKING (FORM 3) [03-06-2024(online)].pdf | 2024-06-03 |
| 2 | 202441042957-REQUEST FOR EARLY PUBLICATION(FORM-9) [03-06-2024(online)].pdf | 2024-06-03 |
| 3 | 202441042957-POWER OF AUTHORITY [03-06-2024(online)].pdf | 2024-06-03 |
| 4 | 202441042957-OTHERS [03-06-2024(online)].pdf | 2024-06-03 |
| 5 | 202441042957-FORM-9 [03-06-2024(online)].pdf | 2024-06-03 |
| 6 | 202441042957-FORM FOR SMALL ENTITY(FORM-28) [03-06-2024(online)].pdf | 2024-06-03 |
| 7 | 202441042957-FORM 1 [03-06-2024(online)].pdf | 2024-06-03 |
| 8 | 202441042957-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [03-06-2024(online)].pdf | 2024-06-03 |
| 9 | 202441042957-EDUCATIONAL INSTITUTION(S) [03-06-2024(online)].pdf | 2024-06-03 |
| 10 | 202441042957-DRAWINGS [03-06-2024(online)].pdf | 2024-06-03 |
| 11 | 202441042957-DECLARATION OF INVENTORSHIP (FORM 5) [03-06-2024(online)].pdf | 2024-06-03 |
| 12 | 202441042957-COMPLETE SPECIFICATION [03-06-2024(online)].pdf | 2024-06-03 |
| 13 | 202441042957-FORM-26 [11-07-2024(online)].pdf | 2024-07-11 |