Abstract: The present invention relates to an intelligent toy or device for speech interaction with a child. The device comprises at least one microcomputer chip along with at least one camera and at least one speaker. The intelligent toy is adapted to recognize the facial expression of a person (especially a kid) and respond accordingly via speech and music. The device uses the Haar-Cascade technique with a Local Binary Pattern Histogram and a labelled dataset of 60,000 images to design the facial emotion recognition model.
This invention relates to an intelligent toy with emotions. More particularly, the present invention relates to a machine learning-based intelligent toy that can recognize the emotion of the user and act accordingly.
Background of the Invention:
Playing with toys is an essential part of the daily routine of children for cognitive, social, and physical development. Toys come in many shapes, sizes, and functionalities. Toys for younger children are often in the form of dolls, trucks, blocks, and buildings. Intelligent toys can boost creativity and inventiveness.
Patent No. CN106941002A describes a voice-controlled intelligent toy for children's rehearsal.
Patent No. CN104538043A describes an application for real-time emotion suggestion during a call.
Patent No. CN109062404B describes an interactive system applied to an intelligent early education machine for children using facial expression recognition.
Patent No. AU2015230802B2 describes an intelligent toy having intelligent elements configured to mimic the appearance of a person, animal, vehicle, or other characters.
Patent No. CN203750179U describes a quadruped voice interaction intelligent toy which comprises a quadruped body composed of movable limb parts and a driving motor.
Patent No. CN107126709A describes an intelligent toy possessing a phonetic (voice) function.
Patent No. CN107126708A describes an intelligent toy dog possessing an illuminating function.
Patent No. CN102527055A describes a voice interaction intelligent toy, which comprises a toy model in the form of a moving robot and a voice interaction intelligent control system.
Patent No. CN101474481B describes an emotion robot system which can generate human-simulated facial expressions and can interact with people.
Numerous varieties of intelligent toys have been developed for users to date. However, their main focus is on the education of small kids. The present invention provides an intelligent toy which not only recognizes the emotions of a person (especially a kid) but also responds accordingly, trying to soothe the user's emotions when he or she is sad or angry.
During the lockdown of the Covid-19 pandemic, everyone was confined to their homes. Kids were not going to school or parks. Moreover, these days both parents are often busy, and kids have either no sibling or only one. They need a companion which understands their basic emotions and helps to soothe them. In such a scenario, the present invention can prove to be a very effective tool to enhance the mood of a person (especially a kid) dealing with sadness and stress by playing a song. It can reduce the negativity around the user and may even help to reduce suicide cases.
Toys are available in many shapes, sizes, and functionalities in the market. Toys for younger children are often in the form of dolls, trucks, blocks, and buildings, to name just a few. Toys for older children, young adults, and adults can be more sophisticated, complex, and intriguing. However, toys historically could not respond interactively and autonomously to the kid who is playing with them. Therefore, a need exists for interactive and autonomous toy responses to increase play enjoyment, value, and interest. The present invention is an intelligent toy that is capable of recognizing the emotions of a person and acting accordingly. For instance, if the person (especially a kid) who is holding the toy is sad, it will try to regulate his or her emotions by playing music.
Object of the invention:
The primary object of the present invention is to provide an intelligent toy that can recognize a child's facial emotions by using a machine learning-based approach, namely a Haar-Cascade classifier with a Local Binary Pattern Histogram.
Another object of the present invention is to take an appropriate action based on the recognized facial emotions.
An additional object is to regulate the emotions of a person (especially a kid) by playing music and to provide him or her with a companion.
Summary of the invention:
The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiments and is not intended as a full description of the invention. A full appreciation of the various aspects of the preferred embodiments disclosed herein can be gained by taking the entire specification, claims, drawings, and abstract as a whole.
The present invention responds to the facial expressions of a person (especially a kid). It processes the input via a processor and associated memory. Based on the input and the computer program module(s), the present invention then presents an output in the form of speech and music to the user.
Additional features and advantages of the methodology of the present invention provide an emotion-recognizing intelligent toy that takes action based upon the current emotional state of the person (especially a kid).
Other features and advantages of the present invention will become more apparent from the following detailed description, taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the invention.
Brief description of the accompanying drawings:
The drawings described herein are for illustrative purposes only of selected embodiments and not all possible implementations and are not intended to limit the scope of the present disclosure. The invention itself, however, both as to organization and method of operation, may best be understood by reference to the detailed description which follows taken in conjunction with the accompanying drawings in which:
Figure- 1 shows the block diagram of the toy, according to an embodiment of the present invention;
Figure- 2 is a schematic representation of the toy, according to an embodiment of the present invention;
Figure- 3 shows the working mechanism of the toy, according to an embodiment of the present invention;
Figure- 4 shows the working of the Machine Learning based Haar Cascade based classifier of the toy, according to an embodiment of the present invention.
Detailed description of the invention:
The preferred embodiments will now be described more fully with reference to the accompanying drawings. The embodiments are provided so that this disclosure will be thorough, and will fully convey the scope to those who are skilled in the art. Numerous specific details are set forth as specific components, devices, and methods, to provide a thorough understanding of preferred embodiments of the present disclosure. It will be apparent to those skilled in the art that specific details need not be employed, that preferred embodiments may be embodied in many different forms, and that neither should be construed to limit the scope of the disclosure. In some example embodiments, well-known processes, well-known device structures, and well-known technologies are not described in detail.
Definitions of the used terms: The following definitions should be interpreted in consideration of the disclosure and for a better understanding of the present invention to a person skilled in the art. In the complete disclosure of the present invention, certain scientific and electrical terms are appropriately used, having the same meaning as practically and theoretically understood by an ordinary person skilled in the art.
Haar-Cascade classifier: The term “Haar-Cascade classifier” used herein refers to a machine learning-based approach in which a cascade of classifiers is trained on a large number of positive and negative images.
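As an illustration of the cascade idea defined above, a candidate image window is accepted only if it passes every stage in turn, so most non-face windows are rejected cheaply at an early stage. The stage scores and thresholds in this sketch are invented for illustration and are not taken from the specification.

```python
# Illustrative sketch of the cascade principle behind a Haar-Cascade
# classifier: a sequence of increasingly strict stages, each of which
# can reject a candidate window early. Scores and thresholds are
# invented for illustration only.

def cascade_classify(stage_scores, stage_thresholds):
    """Return True only if the window passes every stage in order."""
    for score, threshold in zip(stage_scores, stage_thresholds):
        if score < threshold:
            return False  # rejected early; later stages never run
    return True

thresholds = [0.2, 0.5, 0.8]
# A window failing the first (cheap) stage is discarded immediately.
print(cascade_classify([0.1, 0.9, 0.9], thresholds))  # False
# A face-like window must pass all stages.
print(cascade_classify([0.9, 0.9, 0.9], thresholds))  # True
```

In a real detector each stage evaluates Haar-like features over the window; this sketch keeps only the early-rejection control flow that makes the cascade fast.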
Microcomputer: The term “Microcomputer” used herein refers to a self-contained computer-on-a-chip that contains a microprocessor.
According to one embodiment, an intelligent toy or device for speech interaction with a child comprises at least one microcomputer chip along with at least one camera and at least one speaker. In describing the preferred embodiments of the present invention, reference will be made herein to Figures 1, 2, 3, and 4 of the drawing which explains the features of the invention.
The present invention can interact with children in the same manner as they have been taught by humans. The whole circuit is fitted inside the toy as a system using one or more program modules. The intelligent toy receives input from a child in the form of facial expressions.
The architecture and techniques of the toy, as shown in Figure- 1, are directed to respond to the child's inputs with, after processing, appropriate audio outputs. In this way, the method, system, and/or apparatus of the present invention interacts with the child and provides a companion to regulate his or her emotions.
A preferred embodiment of the present invention comprises a microcomputer chip along with a camera and a speaker. Referring to Figure- 2, the circuit and power adapter are installed inside the toy, and cameras are fitted in the eyes of the toy to capture facial images.
In a preferred embodiment of the present invention, in the working mechanism of the toy shown in Figure- 3, the microcomputer chip executes the stored instructions to recognize the facial expressions of the child: it receives input from the child via the camera, interprets the facial expressions by analysing the input with machine learning-based techniques, and presents output in the form of a speech response to the child.
Example of the device and person (especially kid) interaction:
The camera captures the facial image of the person (especially a kid) who is playing with the toy. That image is passed on to the microcomputer chip (Raspberry Pi 3 B+). The Haar Cascade model for facial emotion recognition is trained and installed on the microcomputer. When it receives the facial image, it recognizes the emotion and takes the appropriate action. For example, if the child is sad, it says, “Hey baby! Why are you sad? Let me play a song for you,” and plays a song. If the child is angry, it says, “You look cute while smiling, please smile. Don't get angry. Let me play a song for you,” and plays a song. If the child is happy or neutral, it remains silent.
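The response logic of the interaction example above can be sketched as follows. The function name is illustrative and not from the specification; the spoken phrases follow the example, and the actual speaker and music playback on the Raspberry Pi are left out of this sketch.

```python
# Hypothetical sketch of the toy's emotion-to-response mapping described
# in the interaction example. respond_to_emotion() is an illustrative
# name; the real device would also trigger the speaker and play a song.

def respond_to_emotion(emotion):
    """Map a recognized emotion to the toy's spoken phrase, or None for silence."""
    if emotion == "sad":
        return "Hey baby! Why are you sad? Let me play a song for you"
    if emotion == "angry":
        return ("You look cute while smiling, please smile. "
                "Don't get angry. Let me play a song for you")
    # For happy or neutral expressions the toy remains silent.
    return None

print(respond_to_emotion("sad"))
print(respond_to_emotion("happy"))  # None (silence)
```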
In a preferred embodiment of the present invention, Figure- 4 illustrates the intelligent toy, with which a person (especially a kid) can interact for play or monitoring. The Haar-Cascade technique with a Local Binary Pattern Histogram and a labelled dataset of 60,000 images is used to design the facial emotion recognition model that drives the operation of the system. The dataset contains images of all age groups collected from Google and other sources. A kid normally expresses the basic emotions (Happy, Angry, Sad, and Neutral) most often, and the present invention uses these emotions as inputs to determine its response.
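The Local Binary Pattern Histogram technique mentioned above can be illustrated with a minimal sketch: each pixel is encoded by comparing it with its eight neighbours, and a histogram of these codes describes the face region. This pure-Python version is for illustration only; a deployed model would use an optimized library implementation.

```python
# Minimal sketch of the Local Binary Pattern (LBP) computation underlying
# an LBPH facial-recognition model. The image is a 2-D list of grey values.

def lbp_code(image, x, y):
    """8-bit LBP code for interior pixel (x, y)."""
    center = image[y][x]
    # Neighbours in clockwise order starting from the top-left corner.
    offsets = [(-1, -1), (0, -1), (1, -1), (1, 0),
               (1, 1), (0, 1), (-1, 1), (-1, 0)]
    code = 0
    for bit, (dx, dy) in enumerate(offsets):
        if image[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(image):
    """Histogram of LBP codes over all interior pixels of the image."""
    hist = [0] * 256
    for y in range(1, len(image) - 1):
        for x in range(1, len(image[0]) - 1):
            hist[lbp_code(image, x, y)] += 1
    return hist

# A dark pixel surrounded by brighter neighbours sets all eight bits.
patch = [[9, 9, 9],
         [9, 1, 9],
         [9, 9, 9]]
print(lbp_code(patch, 1, 1))  # 255
```

In LBPH face recognition the face is divided into a grid of cells, one histogram is computed per cell, and the concatenated histograms form the feature vector that is compared between faces.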
Although a preferred embodiment of the invention has been illustrated and described, it will at once be apparent to those skilled in the art that the invention includes advantages and features over and beyond the specific illustrated construction. Accordingly, it is intended that the scope of the invention be limited solely by the scope of the hereinafter appended claims, and not by the foregoing specification when interpreted in light of the relevant prior art.
We claim:
1. A device for speech interaction with a child, the device comprising:
at least one camera;
at least one speaker; and
at least one microcomputer chip for executing stored instructions for:
- recognizing the facial expressions of the child,
- receiving inputs from the child via the camera, and
- outputting a speech and music response to the child based on the interpreted facial expressions.
2. The device as claimed in claim 1, wherein the microcomputer chip is Raspberry Pi 3 B+.
| # | Name | Date |
|---|---|---|
| 1 | 202111051272-Correspondence to notify the Controller [14-01-2025(online)].pdf | 2025-01-14 |
| 2 | 202111051272-POWER OF AUTHORITY [09-11-2021(online)].pdf | 2021-11-09 |
| 3 | 202111051272-REQUEST FOR ADJOURNMENT OF HEARING UNDER RULE 129A [18-10-2024(online)].pdf | 2024-10-18 |
| 4 | 202111051272-Written submissions and relevant documents [30-01-2025(online)].pdf | 2025-01-30 |
| 5 | 202111051272-US(14)-HearingNotice-(HearingDate-22-10-2024).pdf | 2024-09-27 |
| 6 | 202111051272-US(14)-ExtendedHearingNotice-(HearingDate-16-01-2025)-1500.pdf | 2024-12-04 |
| 7 | 202111051272-FORM FOR SMALL ENTITY(FORM-28) [09-11-2021(online)].pdf | 2021-11-09 |
| 8 | 202111051272-FORM 3 [24-01-2025(online)].pdf | 2025-01-24 |
| 9 | 202111051272-CLAIMS [29-10-2022(online)].pdf | 2022-10-29 |
| 10 | 202111051272-FORM 1 [09-11-2021(online)].pdf | 2021-11-09 |
| 11 | 202111051272-PETITION UNDER RULE 137 [24-01-2025(online)].pdf | 2025-01-24 |
| 12 | 202111051272-CORRESPONDENCE [29-10-2022(online)].pdf | 2022-10-29 |
| 13 | 202111051272-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [09-11-2021(online)].pdf | 2021-11-09 |
| 14 | 202111051272-FORM-26 [15-01-2025(online)].pdf | 2025-01-15 |
| 15 | 202111051272-EDUCATIONAL INSTITUTION(S) [09-11-2021(online)].pdf | 2021-11-09 |
| 16 | 202111051272-DRAWING [29-10-2022(online)].pdf | 2022-10-29 |
| 17 | 202111051272-FER_SER_REPLY [29-10-2022(online)].pdf | 2022-10-29 |
| 18 | 202111051272-DRAWINGS [09-11-2021(online)].pdf | 2021-11-09 |
| 19 | 202111051272-FORM 3 [29-10-2022(online)].pdf | 2022-10-29 |
| 20 | 202111051272-DECLARATION OF INVENTORSHIP (FORM 5) [09-11-2021(online)].pdf | 2021-11-09 |
| 21 | 202111051272-COMPLETE SPECIFICATION [09-11-2021(online)].pdf | 2021-11-09 |
| 22 | 202111051272-OTHERS [29-10-2022(online)].pdf | 2022-10-29 |
| 23 | 202111051272-FER.pdf | 2022-05-04 |
| 24 | 202111051272-FORM-9 [17-01-2022(online)].pdf | 2022-01-17 |
| 25 | 202111051272-FORM 18 [17-01-2022(online)].pdf | 2022-01-17 |
| 26 | SearchStrategyE_02-05-2022.pdf | |