Abstract: The present disclosure pertains to an ear recognition and authentication device including one or more earbuds (102) adapted to be worn by an entity, a scanner (106), and a processor (108) operatively coupled to the scanner (106). The scanner (106) is configured to scan one or more images of an ear surface associated with the entity and correspondingly generate a first set of signals. The processor (108) is configured to identify a set of ear points from the first set of signals, authenticate the identified set of ear points with pre-stored ear points associated with the entity, and generate a set of verification signals, where the set of verification signals facilitates adjusting the one or more earbuds (102) inside one or more earholes of the entity in response to the authenticated set of ear points and enables the one or more earbuds (102) to grip the ear surface.
Claims: 1. An ear recognition and authentication device (100) comprising:
one or more earbuds (102) adapted to be worn by an entity;
a scanner (106) configured with each of the one or more earbuds (102) to scan one or more images of an ear surface associated with the entity, and correspondingly generate a first set of signals;
a processor (108) operatively coupled to the scanner (106), wherein the processor (108) includes a memory, the memory storing instructions executable by the processor to:
identify a set of ear points from the first set of signals;
train and test the set of ear points based on the identified set of ear points;
authenticate the identified set of ear points with a dataset, wherein the dataset includes pre-stored ear points associated with the entity; and
generate a set of verification signals and transmit the set of verification signals to the one or more earbuds (102), wherein the set of verification signals facilitates adjusting the one or more earbuds (102) inside one or more earholes of the entity in response to the authenticated set of ear points and enables the one or more earbuds (102) to grip the ear surface.
2. The device (100) as claimed in claim 1, wherein the scanner (106) includes any or a combination of a camera and an image scanner.
3. The device (100) as claimed in claim 1, wherein the authentication device (100) is wearable and adapted to be worn by the entity, and wherein the authentication device (100) is selected from a group including an earphone, a headphone, earpods, and an earpiece.
4. The device (100) as claimed in claim 1, wherein the processor (108) is configured to generate a set of warning signals upon negative verification of the identified set of ear points, wherein the negative verification pertains to a mismatch between the identified set of ear points and the dataset.
5. The device (100) as claimed in claim 4, wherein the authentication device (100) includes an alert unit (110) operatively coupled with the processor (108), wherein the alert unit (110) is configured with the one or more earbuds (102), and wherein the alert unit (110) includes any or a combination of a light emitting diode, a buzzer, and an alarm.
6. The device (100) as claimed in claim 1, wherein each of the one or more earbuds (102) includes a set of sensors (104) configured to sense the ear surface associated with the entity, and correspondingly generate a second set of signals, wherein the second set of signals is transmitted to the processor (108), and wherein the processor (108) facilitates activating the authentication device (100) in response to the received second set of signals.
7. The device (100) as claimed in claim 6, wherein the set of sensors (104) includes any or a combination of a piezoelectric sensor, a force sensor, and a pressure sensor.
8. An ear recognition and authentication method (400) comprising:
scanning, at a processor (108), one or more images of an ear surface associated with an entity through a scanner (106), wherein the scanner (106) is configured with each of one or more earbuds (102), wherein the one or more earbuds (102) are adapted to be worn by the entity, and wherein after scanning, the scanner (106) correspondingly generates a first set of signals;
identifying, at the processor (108), a set of ear points from the first set of signals;
training and testing, at the processor, the set of ear points based on the identified set of ear points;
authenticating, at the processor (108), the identified set of ear points with a dataset, wherein the dataset includes pre-stored ear points associated with the entity; and
generating, at the processor (108), a set of verification signals and transmitting the set of verification signals to the one or more earbuds (102), wherein the set of verification signals facilitates adjusting the one or more earbuds (102) inside one or more earholes of the entity in response to the authenticated set of ear points and enables the one or more earbuds (102) to grip the ear surface.
Description: TECHNICAL FIELD
[0001] The present disclosure relates generally to the field of wearables. More particularly, the present disclosure provides an ear recognition and authentication device and method.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] Wearables can include a wide variety of devices for the ears, eyes, wrist, and the like. Authentication of wearables is also necessary to ensure their safety and proper working. However, existing authentication systems for ear-wearables are either dependent on a second device or work on a PIN or password. Presently available ear-wearables are also not a good fit to the ears, and the entity needs to adjust the ear-wearables manually for fitting and grip on the ear surface. Further, the second-device dependency for authentication is tedious and requires extra effort and capital. Therefore, a system that incorporates both standalone authentication and automatic fitting is needed.
[0004] Existing solutions can include providing an ear-bone print scanner. However, such a solution can be uncomfortable. Other solutions can include changing the shape and size of the ear-wearables to adjust them inside one or more earholes of an entity. A pre-installed ear recognition feature is required to authenticate and match the ear, and automatic adjustment of the ear-wearables is required without changing their shape and size.
[0005] There is a need to overcome the mentioned problems of the prior art by bringing a solution that facilitates automatic adjustment of the ear-wearables without changing their shape and size, so that the automatically adjusted ear-wearables fit well in the entity's earholes. The solution should also authenticate and match the ear, such that the device can be used only upon a successful match.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies, are as listed herein below.
[0007] It is an object of the present disclosure to provide an ear recognition and authentication device with a pre-installed ear recognition feature to authenticate and match the ear, such that the device can be used only upon a successful match.
[0008] It is an object of the present disclosure to provide an ear recognition and authentication device that facilitates adjusting an earpiece according to one or more earholes of an entity.
[0009] It is an object of the present disclosure to provide an ear recognition and authentication device where machine learning models along with robotic auto-mechanics facilitate fitting the device into an earhole of an entity.
[0010] It is an object of the present disclosure to provide an ear recognition and authentication device that provides a firm grip, is easy to wear and comfortable, and can be used by any age group.
[0011] It is an object of the present disclosure to provide an ear recognition and authentication device that helps in adjusting the device into the earholes of the entity without changing the shape and size of the device.
SUMMARY
[0012] The present disclosure relates generally to the field of wearables. More particularly, the present disclosure provides an ear recognition and authentication device and method.
[0013] An aspect of the present disclosure pertains to an ear recognition and authentication device including one or more earbuds, a scanner, a processor, and a set of sensors. The one or more earbuds may be adapted to be worn by an entity. The scanner may be configured with each of the one or more earbuds to scan one or more images of an ear surface associated with the entity and correspondingly generate a first set of signals. The processor may be operatively coupled to the scanner, where the processor may include a memory, the memory storing instructions executable by the processor. The processor may be configured to identify a set of ear points from the first set of signals, and train and test the set of ear points based on the identified set of ear points. The processor may be configured to authenticate the identified set of ear points with a dataset, where the dataset may include pre-stored ear points associated with the entity. The processor may be configured to generate a set of verification signals and transmit the set of verification signals to the one or more earbuds, where the set of verification signals may facilitate adjusting the one or more earbuds inside an earhole of the entity in response to the authenticated set of ear points and enable the one or more earbuds to grip the ear surface.
[0014] In an aspect, the scanner may include any or a combination of a camera and an image scanner.
[0015] In an aspect, the device may be wearable and adapted to be worn by the entity, and the device may be selected from a group including an earphone, a headphone, earpods, and an earpiece.
[0016] In an aspect, the processor may be configured to generate a set of warning signals upon negative verification of the identified set of ear points, where the negative verification may pertain to a mismatch between the identified set of ear points and the dataset.
[0017] In an aspect, the device may include an alert unit operatively coupled with the processor, where the alert unit may be configured with the one or more earbuds, and where the alert unit may include any or a combination of a light emitting diode, a buzzer, and an alarm.
[0018] In an aspect, each of the one or more earbuds may include a set of sensors configured to sense the ear surface associated with the entity and correspondingly generate a second set of signals, where the second set of signals may be transmitted to the processor, and where the processor may facilitate activating the device in response to the received second set of signals.
[0019] In an aspect, the set of sensors may include any or a combination of a piezoelectric sensor, a force sensor, and a pressure sensor.
[0020] Another aspect of the present disclosure pertains to an ear recognition and authentication method including scanning, at a processor, one or more images of an ear surface associated with an entity through a scanner, where the scanner may be configured with each of one or more earbuds. The one or more earbuds may be adapted to be worn by the entity, and after scanning, the scanner may correspondingly generate a first set of signals. The method may include identifying, at the processor, a set of ear points from the first set of signals, and training and testing, at the processor, the set of ear points based on the identified set of ear points. The method may include authenticating, at the processor, the identified set of ear points with a dataset, where the dataset may include pre-stored ear points associated with the entity, and generating, at the processor, a set of verification signals and transmitting the set of verification signals to the one or more earbuds. The set of verification signals may facilitate adjusting the one or more earbuds inside an earhole of the entity in response to the authenticated set of ear points and enable the one or more earbuds to grip the ear surface.
BRIEF DESCRIPTION OF THE DRAWINGS
[0021] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0022] The diagrams are for illustration only, which thus is not a limitation of the present disclosure, and wherein:
[0023] FIG. 1 illustrates a block diagram of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0024] FIG. 2 illustrates exemplary functional components of the processing unit of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0025] FIG. 3 illustrates an exemplary view of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0026] FIG. 4 illustrates an exemplary method of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0027] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0028] The present disclosure relates generally to the field of wearables. More particularly, the present disclosure provides an ear recognition and authentication device and method.
[0029] FIG. 1 illustrates a block diagram of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0030] As illustrated in FIG. 1, the proposed ear recognition and authentication device (100) (also referred to as device (100), herein) can include one or more earbuds (102), a set of sensors (104), a scanner (106), a processor (108) (interchangeably referred to as processing unit (108), herein), and an alert unit (110). In an embodiment, the device (100) can be wearable, where the device (100) can be selected from a group including earpods, an earphone, and the like, but is not limited thereto. The device (100) can help an entity in adjusting the earpods as per the convenience of the entity; according to the ear bone and ear shape, the device (100) can be adjusted and gripped to the ear surface of the entity.
[0031] In an illustrative embodiment, the one or more earbuds (102) can be adapted to be worn by the entity. In another illustrative embodiment, the scanner (106) can be configured with each of the one or more earbuds (102) to scan one or more images of the ear surface associated with the entity and correspondingly generate a first set of signals. In yet another illustrative embodiment, the first set of signals generated by the scanner (106) can be in digital form or machine-readable form, where the first set of signals can be transmitted to the processing unit (108).
[0032] In an illustrative embodiment, the processing unit (108) can be operatively coupled to the scanner (106), where the processing unit (108) can include one or more processors with a memory, the memory storing instructions executable by the processor. In another illustrative embodiment, the first set of signals can be received by the processing unit (108). The processing unit (108) can facilitate adjusting the one or more earbuds on the ear surface with the help of a machine learning model and algorithm, where the processing unit (108) can be configured with the machine learning model and algorithm.
[0033] In an illustrative embodiment, the scanner (106) can include any or a combination of a camera, an image scanner, and the like. In another illustrative embodiment, the set of sensors (104) can include any or a combination of a piezoelectric sensor, a force sensor, a pressure sensor, and the like. Each of the one or more earbuds (102) can include a set of sensors (104) configured to sense the ear surface associated with the entity and correspondingly generate a second set of signals, where the second set of signals can be transmitted to the processing unit (108), and where the processing unit (108) can facilitate activating the device (100) in response to the received second set of signals.
[0034] In an illustrative embodiment, the device (100) can include an alert unit (110) operatively coupled with the processing unit (108), where the alert unit (110) can be configured with the one or more earbuds (102), and where the alert unit (110) can include any or a combination of a light emitting diode, a buzzer, an alarm, and the like. In another illustrative embodiment, the processing unit (108) can be configured to generate a set of warning signals and transmit the set of warning signals to the one or more earbuds (102) when the identified set of ear points is not authenticated by the processing unit (108). The alert unit (110) can facilitate alerting the entity upon failed authentication of the one or more ear points.
[0035] In an illustrative embodiment, the processing unit (108) can facilitate identification of unique one or more ear points on the ear surface, which helps in recognition of the ears associated with the entity. In another illustrative embodiment, the processing unit (108) can enable training a machine learning model for recognition of one or more ear data points in three-dimensional mode. Once the one or more ear data points are recognized by the machine learning model, the processing unit (108) can facilitate authentication of the recognized one or more ear data points for safety of the device (100). In yet another illustrative embodiment, after authentication, using the machine learning model, the camera (106) and the set of sensors (104) can help the device (100) adjust and fit inside the earhole to provide a strong grip and better fitting to the entity's ear.
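For a reader implementing such a scheme, the three-dimensional ear-point matching described above can be sketched as a per-point distance check. The following Python sketch is illustrative only and not part of the claimed subject matter; the point format (lists of (x, y, z) tuples) and the tolerance value are assumptions:

```python
import math

def match_ear_points(scanned, enrolled, tolerance=2.0):
    """Return True when every scanned ear data point lies within `tolerance`
    (arbitrary units, assumed) of the corresponding pre-stored point.

    Both arguments are equal-length lists of (x, y, z) tuples, i.e. ear
    data points recognized in three-dimensional mode.
    """
    if len(scanned) != len(enrolled):
        return False  # point sets of different sizes cannot correspond
    for point_a, point_b in zip(scanned, enrolled):
        # Euclidean distance between corresponding 3-D ear points
        if math.dist(point_a, point_b) > tolerance:
            return False
    return True
```

In practice the correspondence between scanned and enrolled points would come from the trained machine learning model; the fixed ordering assumed here simply keeps the sketch short.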
[0036] In an illustrative embodiment, the one or more earbuds (102) can have a sound output, where the camera (106) and the set of sensors (104) can facilitate recognition of the set of ear points. In another illustrative embodiment, the ear associated with the entity can be held parallel to the camera (106) for scanning before wearing. A device (100) such as an earphone or earpods does not change shape or structure while being worn, nor does it shrink or expand. In yet another illustrative embodiment, the device (100) can be adjusted on the ear surface and fitted inside the earhole of the entity with the help of robotic auto-mechanics. The device (100) can be adjusted according to the placement of the earhole of the entity and does not adjust depending on the outer ear shape.
[0037] FIG. 2 illustrates exemplary functional components of the processing unit of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0038] As illustrated in FIG. 2, the processing unit (108) can include one or more processor(s) (202). The one or more processor(s) (202) can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) (202) are configured to fetch and execute computer-readable instructions stored in a memory (204) of the processing unit (108). The memory (204) can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory (204) can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0039] In an embodiment, the processing unit (108) can also include an interface(s) (206). The interface(s) (206) may include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) (206) may facilitate communication of the processing unit (108) with various devices coupled to the processing unit (108). The interface(s) (206) may also provide a communication pathway for one or more components of processing unit (108). Examples of such components include, but are not limited to, processing engine(s) (208) and database (210).
[0040] In an embodiment, the processing engine(s) (208) can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) (208). In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) (208) may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) (208) may include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) (208). In such examples, the processing unit (108) can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to processing unit (108) and the processing resource. In other examples, the processing engine(s) 208 may be implemented by electronic circuitry. A database (210) can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) (208).
[0041] In an embodiment, the processing engine(s) (208) can include an identification unit (212), a training and testing unit (214), an authentication unit (216), a signal generation unit (218), and other unit(s) (220). The other unit(s) (220) can implement functionalities that supplement applications or functions performed by the device (100) or the processing engine(s) (208).
[0043] It would be appreciated that units being described are only exemplary units and any other unit or sub-unit may be included as part of the device (100). These units too may be merged or divided into super- units or sub-units as may be configured.
[0044] As illustrated in FIG. 2, the processing unit (108) can be configured to receive a first set of signals from a scanner (106) in machine-readable form or binary form, where the first set of signals is transmitted to an identification unit (212). The identification unit (212) includes an extraction unit, where the extraction unit facilitates extracting a set of ear points from one or more images of the ear surface captured by the scanner (106), and the identification unit (212) can facilitate identification of the set of ear points from the captured one or more images of the ear surface associated with an entity.
[0045] In an illustrative embodiment, the scanner (106) can be a camera or the like, where the camera can be configured to capture one or more images of the ear surface associated with the entity, and the extraction unit can extract the set of ear points from the one or more images of the ear surface and transmit the extracted set of ear points to the training and testing unit (214). In another illustrative embodiment, the training and testing unit (214), with the help of the machine learning model, can create an ear points dataset according to the extracted set of ear points, where the created ear points dataset can be stored in the database (210).
[0046] In an illustrative embodiment, after identification of the set of ear points from the captured one or more images of the ear surface, the training and testing unit (214) can facilitate training and testing of the identified set of ear points. In another illustrative embodiment, the training and testing unit (214) can be configured with a machine learning model and algorithm to facilitate training of the identified set of ear points.
[0047] In an illustrative embodiment, after the training of the identified set of ear points through the machine learning algorithm and model, the authentication unit (216) can be configured to verify the identified set of ear points against a dataset, where the dataset can include pre-stored ear points associated with the entity, and where the dataset can be stored in the database (210). In another illustrative embodiment, the authentication unit (216) can be configured to match the identified set of ear points with the pre-stored ear points associated with the entity and accordingly transmit the authenticated set of ear points to the signal generation unit (218).
[0048] In an illustrative embodiment, the signal generation unit (218) can be configured to receive the authenticated set of ear points and generate a set of verification signals upon matching of the identified set of ear points with the pre-stored ear points. In another illustrative embodiment, the set of verification signals can be transmitted to the one or more earbuds (102), and accordingly the one or more earbuds (102) can be adjusted to fit inside the one or more earholes of the entity, providing a grip for the one or more earbuds (102) on the ear surface of the entity. In yet another illustrative embodiment, the signal generation unit (218) can be configured to generate a set of warning signals when the identified set of ear points does not match the pre-stored ear points, thereby helping to alert the entity. The set of warning signals can be transmitted to an alert unit (110), where the alert unit (110) can be operatively coupled to the one or more earbuds (102).
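The routing performed by the signal generation unit (218) can be sketched as a simple dispatch: verification signals go to the earbuds for adjustment, while warning signals go to the alert unit. This Python sketch is illustrative only; the string labels and the `alert_modes` parameter are assumptions, not part of the disclosure:

```python
def dispatch_signal(signal_type, alert_modes=("led", "buzzer")):
    """Route the signal generation unit's output: a verification signal is
    sent to the earbuds (102) for adjustment; a warning signal drives the
    alert unit (110), which may trigger any combination of an LED, a
    buzzer, and an alarm (here modeled by the hypothetical alert_modes).
    """
    if signal_type == "verification":
        return ["earbuds(102): adjust"]
    # negative verification: drive the alert unit's indicators
    return [f"alert_unit(110): {mode}" for mode in alert_modes]
```

A warning dispatch with the default modes would yield one action per configured indicator, mirroring the "any or a combination" language of the disclosure.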
[0049] In an illustrative embodiment, the set of verification signals can facilitate adjusting the one or more earbuds (102) inside one or more earholes of the entity in response to the authenticated set of ear points and enable the one or more earbuds (102) to grip the ear surface.
[0050] In an illustrative embodiment, the other unit(s) (220) can include an activation unit, but are not limited thereto, where the activation unit can facilitate activating the device (100) through a set of sensors (104), where each of the one or more earbuds (102) can include the set of sensors (104) configured to sense the ear surface associated with the entity and correspondingly generate a second set of signals. The second set of signals can be transmitted to the activation unit, where the activation unit can enable activating the device (100) in response to the received second set of signals.
[0051] In an illustrative embodiment, the second set of signals can be in electrical form, where the second set of signals can be converted to machine-readable form or binary form by the extraction unit. The extraction unit can be configured to extract the force or pressure of the ear surface when the set of sensors (104) is pressed, or a force or pressure is exerted on the set of sensors (104) through the ear surface. In another illustrative embodiment, the set of sensors (104) can include any or a combination of a piezoelectric sensor, a force sensor, a pressure sensor, and the like. In yet another illustrative embodiment, the device (100) can be any or a combination of earpods, an earphone, and the like, but is not limited thereto.
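The activation path described above (a sustained press sensed by the force/pressure sensors) can be sketched as a threshold-and-hold check on the second set of signals. This Python sketch is illustrative only; the threshold and the number of samples constituting a "long press" are assumptions:

```python
def should_activate(readings, threshold=5.0, hold_samples=10):
    """Detect a long press from successive force/pressure sensor readings:
    the device activates once `hold_samples` consecutive readings stay at
    or above `threshold` (both values assumed for illustration).
    """
    consecutive = 0
    for reading in readings:
        # extend the run while the press is sustained, reset otherwise
        consecutive = consecutive + 1 if reading >= threshold else 0
        if consecutive >= hold_samples:
            return True  # sustained press detected: activate the device
    return False
```

Requiring consecutive above-threshold samples filters out brief incidental contact with the ear surface, so only a deliberate long press activates the device.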
[0052] In an illustrative embodiment, the processing unit (108) can be equipped with an inbuilt machine learning model. In another illustrative embodiment, the scanner (106) can be configured to authenticate the ear surface through the structure and unique data points matched on the ear surface (which are unique and non-identical for every entity) through a method of three-dimensional recognition. In yet another illustrative embodiment, the earphone can be equipped with a camera (106) to guide the inbuilt robotics and mechanism to where the one or more earholes of the entity are present. The earphone (earpods), through robotic auto-mechanics, can fit into the one or more earholes of the entity, and the processing unit (108) can facilitate adjusting the earphone according to the one or more earholes of the entity.
[0053] FIG. 3 illustrates an exemplary view of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0054] As illustrated in FIG. 3, the device (100) can include one or more earbuds (102), a set of sensors (104), a scanner (106), a processing unit (108), and an alert unit (110). In an illustrative embodiment, the device (100) can be any or a combination of an earphone, earpods, an earpiece, and the like, and the scanner (106) can include a camera, an image scanner, and the like. The camera can be configured to capture one or more images of the ear surface associated with an entity to detect various points of the ear. The processing unit (108) can include a memory for storing the one or more images. In another illustrative embodiment, the processing unit (108) can be configured with a machine learning model, where the machine learning model can be trained with a set of ear points. The processing unit (108) can receive the captured one or more images, determine the set of ear points from the captured one or more images, compare the set of ear points with pre-stored ear points, and facilitate opening or closing of the ear tips accordingly, to enable proper fitting of the earpiece in the entity's ear.
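The opening or closing of the ear tips can be sketched as a stepwise control loop that moves the tip opening toward the measured earhole size. This Python sketch is illustrative only; millimetre units, the step size, and the function interface are assumptions, not part of the disclosure:

```python
def adjust_ear_tip(current_mm, target_mm, step_mm=0.5):
    """Return the sequence of open/close steps that moves the ear-tip
    opening from `current_mm` toward the measured earhole size
    `target_mm`, in increments of `step_mm` (all values assumed).
    """
    steps = []
    while abs(target_mm - current_mm) > step_mm / 2:
        if current_mm < target_mm:
            current_mm += step_mm
            steps.append("open")   # widen the tip toward the earhole size
        else:
            current_mm -= step_mm
            steps.append("close")  # narrow the tip toward the earhole size
    return steps
```

The loop stops once the opening is within half a step of the target, so the tip neither oscillates nor over-shoots; a real actuator would additionally bound the travel range.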
[0055] In an illustrative embodiment, the earpiece can be authenticated for the entity, such that the set of ear points determined by the processing unit (108) for the entity's ear can serve as a unique identity to activate the earpiece. The entity can place the earpiece in the ears and activate the earpiece by actuating a button; consequently, the earpiece can enable the camera to capture the one or more images and transmit the one or more images to the processing unit (108), where the processing unit (108) can compare the received set of ear points with the pre-defined or pre-stored ear points of the entity and, if matched, the earpiece can be authenticated, enabling the entity to use the earphone normally.
[0056] In an illustrative embodiment, the device (100) can be activated by long-pressing on the device surface, which activates the device through the set of sensors (104). The camera or the scanner (106) can be configured to scan the ear for recognition after the device (100) is activated. The processing unit (108) can be configured to recognize the set of ear points after the ear is properly scanned. The inbuilt machine learning model can match the unique set of ear points (ear bone and ear shape) for the entity's authentication. In another illustrative embodiment, after the set of ear points is successfully authenticated, the entity can use the device (100), where the processing unit (108) can enable adjusting the device (100) while wearing with the help of the machine learning model and the camera. The device (100) can automatically fit inside the one or more earholes to provide the best fitting and grip to the ear of the entity.
[0057] FIG. 4 illustrates an exemplary method of the proposed ear recognition and authentication device, in accordance with an embodiment of the present disclosure.
[0058] In an embodiment, FIG. 4 illustrates an ear recognition and authentication method (400). The method (400) can include step (402) of scanning, at a processor (108), one or more images of an ear surface associated with an entity through a scanner (106), where the scanner (106) can be configured with each of one or more earbuds (102), where the one or more earbuds (102) can be adapted to be worn by the entity, and where after scanning, the scanner (106) can correspondingly generate a first set of signals.
[0059] In an embodiment, the method (400) can include step (404) of identifying, at the processor (108), a set of ear points from the first set of signals, where the first set of signals is generated at step (402).
[0060] In an embodiment, the method (400) can include step (406) of training and testing, at the processor (108), the set of ear points based on the identified set of ear points, where the set of ear points can be identified at step (404).
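The training and testing of step (406) can be sketched with a simple train/test split over ear-point feature vectors. The synthetic features, the two hypothetical entities, and the 1-nearest-neighbour matcher are assumptions for illustration; the disclosure does not name the machine learning model used.

```python
import math
import random

random.seed(0)

def synth_samples(center, n, spread=0.5):
    """Generate n noisy feature vectors around a hypothetical ear template."""
    return [[c + random.uniform(-spread, spread) for c in center] for _ in range(n)]

# Two hypothetical entities, each described by a 4-value ear-point feature vector.
data = ([(v, "entity_a") for v in synth_samples([12.0, 30.5, 18.2, 41.0], 20)]
        + [(v, "entity_b") for v in synth_samples([25.7, 13.3, 35.4, 22.8], 20)])
random.shuffle(data)

# Step (406): split the identified ear points into training and testing sets.
split = int(0.8 * len(data))
train_set, test_set = data[:split], data[split:]

def predict(features):
    """1-nearest-neighbour match against the training templates."""
    return min(train_set, key=lambda s: math.dist(s[0], features))[1]

accuracy = sum(predict(f) == label for f, label in test_set) / len(test_set)
```

Because the two synthetic templates are far apart relative to the noise, the held-out accuracy in this toy setup is perfect; a real deployment would tune and validate the model on captured ear images.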
[0061] In an embodiment, the method (400) can include step (408) of authenticating, at the processor (108), the identified set of ear points with a dataset, where the dataset can include pre-stored ear points associated with the entity.
[0062] In an embodiment, the method (400) can include step (410) of generating, at the processor (108), a set of verification signals and transmitting the set of verification signals to the one or more earbuds (102), where the set of verification signals can facilitate adjusting the one or more earbuds (102) inside one or more earholes of the entity in response to the authenticated set of ear points and enable providing grip to the one or more earbuds (102) on the ear surface.
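Steps (402) through (410) can be strung together as a minimal end-to-end sketch. The function names, the toy "brightest pixel per row" landmark extraction, and the signal dictionary format are all illustrative assumptions, not part of the claimed device.

```python
def scan_ear(image):                       # step (402): scanner output
    """Pretend-scan: the 'first set of signals' is just the raw pixel rows."""
    return image

def identify_ear_points(signals):          # step (404)
    """Toy landmark extraction: brightest pixel column in each row."""
    return [row.index(max(row)) for row in signals]

def authenticate_points(points, dataset):  # step (408)
    """Match the identified points against pre-stored ear points."""
    return points in dataset

def verification_signals(authenticated):   # step (410)
    """Signals instructing the earbuds to adjust and grip, or to stay locked."""
    return {"adjust_fit": authenticated, "grip": authenticated}

dataset = [[2, 0, 1]]                      # pre-stored ear points for the entity
image = [[0, 5, 9], [9, 1, 0], [3, 8, 2]]  # brightest columns: 2, 0, 1
signals = scan_ear(image)
points = identify_ear_points(signals)
result = verification_signals(authenticate_points(points, dataset))
```

Here the captured points match the stored template, so the resulting verification signals enable both the fit adjustment and the grip.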
[0063] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0064] The present disclosure provides an ear recognition and authentication device with a pre-installed ear recognition feature that authenticates the wearer by matching the ear, so that the device can be used only when a match is found.
[0065] The present disclosure provides an ear recognition and authentication device that facilitates adjusting the earpiece according to the one or more earholes of the entity.
[0066] The present disclosure provides an ear recognition and authentication device where machine learning models, along with robotic auto mechanics, facilitate fitting the device into the earhole of the entity.
[0067] The present disclosure provides an ear recognition and authentication device that offers a firm grip, is easy to wear and comfortable, and can be used by any age group.
[0068] The present disclosure provides an ear recognition and authentication device that helps in adjusting the device into the earholes of the entity without changing the shape and size of the device.