Abstract: A system 100 for accommodating and monitoring pets is disclosed. The disclosed system 100 includes a pet house 102 that includes a housing to accommodate a pet. The housing includes at least one actuator 106 configured with at least one door 104 of the housing, a first set of sensors 110-1 for capturing biometric attributes of the pet, a second set of sensors 110-2 for detecting anatomical movement attributes of the pet in the housing, and a third set of sensors 110-3 for detecting one or more environmental parameters in the housing. When at least one of the anatomical movement attributes or at least one of the one or more environmental parameters goes beyond a predefined limit, the system correspondingly opens the door, or transmits an alert regarding the health of the pet and the environment in the housing to one or more mobile computing devices.
[0001] The present disclosure relates generally to a pet house. More particularly, the present disclosure provides a system for accommodating and monitoring pets.
BACKGROUND
[0002] Background description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
[0003] With the development of society, people’s living standards are constantly improving. More and more people are starting to raise pets, and pet houses are therefore in demand nowadays. There are already many pet houses, especially dog houses, in the market. People love their pets, consider them family members, and want to provide luxury to their pets as well. Pet houses provide a temporary stay for the pets and protect them from rain and wind. Food and water can also be provided to the pets in their houses, and the pets feel safe and secure while sleeping in the house.
[0004] One difficulty with pet houses is that a person must be near the pet house every time the door needs to be opened or closed; the door is fully dependent upon the person. However, the person cannot always be present near the pet house because of other responsibilities such as office work and household work. Situations arise in which the pet is inside the pet house, the air ventilation is cut off, and because of the locked door the pet cannot come outside by itself; at such a time someone must be near the pet house to open the door, because pets cannot stay in suffocating places. Similarly, sometimes the pet is not feeling well and there is no one in the house to take care of it, and the condition of the pet can worsen. Also, people often go out of station for work or to enjoy time with family and cannot always take the pet with them; at such times the pet misses the family members and sometimes falls ill.
[0005] There is a need to overcome the above-mentioned problems with a solution that facilitates automatic opening and closing of the door of the pet house, along with monitoring of the surrounding environment in the pet house. The solution should also alert the owner of the pet when unusual behavior is observed in the pet house, so that the owner can perform operations accordingly.
OBJECTS OF THE PRESENT DISCLOSURE
[0006] Some of the objects of the present disclosure, which at least one embodiment herein satisfies are as listed herein below.
[0007] It is an object of the present disclosure to provide a system for accommodating and monitoring pets.
[0008] It is an object of the present disclosure to provide a system that enables automatically opening and closing of door of the pet house.
[0009] It is an object of the present disclosure to provide a system that facilitates monitoring behavior of the pets in the pet house remotely.
[0010] It is an object of the present disclosure to provide a system that facilitates monitoring of environment inside the pet house remotely.
[0011] It is an object of the present disclosure to provide a system that facilitates safety of the pet by checking the presence of the pet in the pet house.
[0012] It is an object of the present disclosure to provide a system that alerts owner of the pets to feed the pets.
[0013] It is an object of the present disclosure to provide a system that alerts the owner in case of suffocation in the pet house.
[0014] It is an object of the present disclosure to provide a system for accommodating and monitoring the pets, which is highly scalable, affordable, and cost effective.
SUMMARY
[0015] The present disclosure relates generally to a pet house. More particularly, the present disclosure provides a system for accommodating and monitoring pets.
[0016] An aspect of the present disclosure pertains to a system for accommodating and monitoring pets. The system may include a pet house. The pet house may include a housing that includes at least one door and is configured to accommodate the pet; at least one actuator configured with the at least one door to enable opening and closing of the at least one door; a first set of sensors configured with the housing for capturing biometric attributes of the pet and correspondingly generating a first set of signals; a second set of sensors configured with the housing for detecting anatomical movement attributes of the pet in the housing and correspondingly generating a second set of signals; a third set of sensors configured with the housing for detecting one or more environmental parameters in the housing and correspondingly generating a third set of signals; and a processing unit communicatively coupled with the first set of sensors, the second set of sensors, the third set of sensors, and the at least one actuator.
[0017] In an aspect, the processing unit may include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to: receive the first set of signals, the second set of signals, and the third set of signals; extract the biometric attributes of the pet, the anatomical movement attributes of the pet, and the one or more environmental parameters of the housing, and update a training and testing dataset configured with the processing unit; compare the biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with a dataset comprising predefined limits associated with the biometric attributes, the anatomical movement attributes, and the one or more environmental parameters; and generate a warning signal in case at least one of the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters is beyond the respective predefined limit.
[0018] In an aspect, the first set of sensors may include a camera, a scanner, a nose print sensor, a capacitive sensor, or a combination thereof, and the biometric attributes can include a nose print of the pet.
[0019] In an aspect, the second set of sensors may include a camera, a proximity sensor, a motion sensor, an accelerometer, an infrared sensor, an acoustic sensor, or a combination thereof, and the anatomical movement attributes of the pet may include pacing, shaking, vomiting, barking, howling, digging, or a combination thereof.
[0020] In an aspect, the third set of sensors may include a temperature sensor, a humidity sensor, an oximeter, an air quality sensor, or a combination thereof, and the one or more environmental parameters may include temperature, humidity, oxygen level, air quality, or a combination thereof.
[0021] In an aspect, the at least one door may be movably coupled to the housing at an opening in the housing using one or more of hinges, a shaft, a fitting, a joint, or a combination thereof, enabling the opening and closing of the at least one door.
[0022] In an aspect, the at least one actuator may include a pneumatic actuator, an electromagnetic actuator, or a combination thereof, operatively coupled to the at least one door.
[0023] In an aspect, the processing unit may be configured to generate an alert signal pertaining to a predefined feeding time of the pet, and transmit the alert signal to the one or more mobile computing devices after the predefined feeding time.
[0024] In an aspect, a unique identification number may be assigned to the pet house, which facilitates a user associated with the one or more mobile computing devices to control a plurality of pet houses through the one or more mobile computing devices.
[0025] In an aspect, the one or more mobile computing devices may be communicatively coupled with the processing unit and may be configured to receive the warning signal and the alert signal from the processing unit, and the one or more mobile computing devices may comprise any or a combination of a cell phone, a laptop, a computer, a palmtop, an iPad, and a tablet.
[0026] In an aspect, the system may include a communication unit operatively coupled with the processing unit, configured to communicatively couple the one or more mobile computing devices with the processing unit, and the communication unit may comprise any or a combination of a Wireless Fidelity (Wi-Fi) module, a Bluetooth module, a Li-Fi module, optical fiber, a Wireless Local Area Network (WLAN), and a ZigBee module.
BRIEF DESCRIPTION OF THE DRAWINGS
[0027] The accompanying drawings are included to provide a further understanding of the present disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure. The diagrams are for illustration only, which thus is not a limitation of the present disclosure.
[0028] In the figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label with a second label that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.
[0029] FIG. 1 illustrates a block diagram of the proposed system for accommodating and monitoring the pets, in accordance with an embodiment of the present disclosure.
[0030] FIG. 2 illustrates exemplary functional components of the proposed system, in accordance with an embodiment of the present disclosure.
[0031] FIG. 3 illustrates an exemplary view of the opening and closing of the at least one door of the pet house, in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
[0032] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0033] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[0034] The present disclosure relates generally to a pet house. More particularly, the present disclosure provides a system for accommodating and monitoring pets.
[0035] The present disclosure elaborates a system for accommodating and monitoring pets. The system can include a pet house. The pet house can include a housing that includes at least one door and is configured to accommodate the pet; at least one actuator configured with the at least one door to enable opening and closing of the at least one door; a first set of sensors configured with the housing for capturing biometric attributes of the pet and correspondingly generating a first set of signals; a second set of sensors configured with the housing for detecting anatomical movement attributes of the pet in the housing and correspondingly generating a second set of signals; a third set of sensors configured with the housing for detecting one or more environmental parameters in the housing and correspondingly generating a third set of signals; and a processing unit communicatively coupled with the first set of sensors, the second set of sensors, the third set of sensors, and the at least one actuator.
[0036] In an embodiment, the processing unit can include one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors to: receive the first set of signals, the second set of signals, and the third set of signals; extract the biometric attributes of the pet, the anatomical movement attributes of the pet, and the one or more environmental parameters of the housing, and update a training and testing dataset configured with the processing unit; compare the biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with a dataset comprising predefined limits associated with the biometric attributes, the anatomical movement attributes, and the one or more environmental parameters; and generate a warning signal in case at least one of the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters is beyond the respective predefined limit.
[0037] In an embodiment, the first set of sensors can include a camera, a scanner, a nose print sensor, a capacitive sensor, or a combination thereof, and the biometric attributes can include a nose print of the pet.
[0038] In an embodiment, the second set of sensors can include a camera, a proximity sensor, a motion sensor, an accelerometer, an infrared sensor, an acoustic sensor, or a combination thereof, and the anatomical movement attributes of the pet can include pacing, shaking, vomiting, barking, howling, digging, or a combination thereof.
[0039] In an embodiment, the third set of sensors can include a temperature sensor, a humidity sensor, an oximeter, an air quality sensor, or a combination thereof, and the one or more environmental parameters can include temperature, humidity, oxygen level, air quality, or a combination thereof.
[0040] In an embodiment, the at least one door can be movably coupled to the housing at an opening in the housing using one or more of hinges, a shaft, a fitting, a joint, or a combination thereof, enabling the opening and closing of the at least one door.
[0041] In an embodiment, the at least one actuator can include a pneumatic actuator, an electromagnetic actuator, or a combination thereof, operatively coupled to the at least one door.
[0042] In an embodiment, the processing unit can be configured to generate an alert signal pertaining to a predefined feeding time of the pet, and transmit the alert signal to the one or more mobile computing devices after the predefined feeding time.
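The feeding-time alert described above can be sketched in a short hypothetical snippet. The function name, message format, and the idea of passing a `fed` flag are assumptions for illustration; the disclosure only states that an alert is transmitted after the predefined feeding time.

```python
# Hypothetical sketch of the feeding-time alert: once the predefined
# feeding time has passed and the pet has not been fed, an alert message
# is produced for transmission to the mobile computing device.
from datetime import datetime

def feeding_alert(now, feeding_time, fed):
    """Return an alert message after the feeding time if the pet is unfed."""
    if not fed and now >= feeding_time:
        return f"Feeding overdue since {feeding_time:%H:%M}"
    return None

# Example: it is 09:30, feeding was due at 09:00, and the pet is unfed.
print(feeding_alert(datetime(2024, 1, 1, 9, 30),
                    datetime(2024, 1, 1, 9, 0), fed=False))
```

In a deployed system, the `fed` flag might come from a food-bowl sensor; here it is simply a parameter.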
[0043] In an embodiment, a unique identification number can be assigned to the pet house, which facilitates a user associated with the one or more mobile computing devices to control a plurality of pet houses through the one or more mobile computing devices.
[0044] In an embodiment, the one or more mobile computing devices can be communicatively coupled with the processing unit and can be configured to receive the warning signal and the alert signal from the processing unit, and the one or more mobile computing devices can comprise any or a combination of a cell phone, a laptop, a computer, a palmtop, an iPad, and a tablet.
[0045] In an embodiment, the system can include a communication unit operatively coupled with the processing unit, configured to communicatively couple the one or more mobile computing devices with the processing unit, and the communication unit can comprise any or a combination of a Wireless Fidelity (Wi-Fi) module, a Bluetooth module, a Li-Fi module, optical fiber, a Wireless Local Area Network (WLAN), and a ZigBee module.
[0046] FIG. 1 illustrates a block diagram of the proposed system for accommodating and monitoring the pets, in accordance with an embodiment of the present disclosure.
[0047] As illustrated in FIG. 1, the proposed system 100 (also referred to as the system 100 hereinafter) for accommodating and monitoring pets can include a pet house 102. The pet house 102 can include a housing (not shown) configured to accommodate the pet, and the housing includes at least one door 104. An actuator unit 106, a first set of sensors 110-1, a second set of sensors 110-2, a third set of sensors 110-3, and a processing unit 108 can be configured with the housing. In another illustrative embodiment, the processing unit 108 can be communicatively coupled with a first mobile computing device 116-1 and a server 116-2. In an exemplary embodiment, the processing unit 108 can also be located at a remote location.
[0048] In an embodiment, the first set of sensors 110-1 can be coupled to the housing for capturing biometric attributes of the pet and correspondingly generating a first set of signals. The biometric attributes can include a nose print of the pet. In an illustrative embodiment, the first set of sensors 110-1 can include a camera, a scanner, a nose print sensor, a capacitive sensor, or a combination thereof. In another illustrative embodiment, the first set of sensors 110-1 can capture an image of the nose of the pet within a predefined range of the pet house 102 and convert the image into an electrical form as the first set of signals. The first set of signals can be transmitted to the processing unit 108. In an exemplary embodiment, the predefined range can be one meter or less than one meter.
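The capture gate implied above — only imaging the nose when the pet is within the predefined range — can be sketched as follows. The function name and the use of a plain distance reading are assumptions; the one-meter figure comes from the description.

```python
# Hypothetical sketch: trigger the nose-print capture only when a
# proximity reading places the pet within the predefined range of the
# pet house (one meter or less, per the description).
CAPTURE_RANGE_M = 1.0  # predefined range from the description

def should_capture(distance_m):
    """Return True when the pet is close enough to capture a nose print."""
    return distance_m <= CAPTURE_RANGE_M

print(should_capture(0.4))  # pet at 0.4 m: capture
print(should_capture(2.5))  # pet at 2.5 m: do not capture
```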
[0049] In an embodiment, the second set of sensors 110-2 can be coupled to the housing for detecting anatomical movement attributes of the pet in the housing and correspondingly generating a second set of signals. The anatomical movement attributes of the pet can be pacing, shaking, vomiting, barking, howling, digging, or a combination thereof. In an illustrative embodiment, the second set of sensors 110-2 can include a camera, a proximity sensor, a motion sensor, an accelerometer, an infrared sensor, an acoustic sensor, or a combination thereof. In another illustrative embodiment, the second set of sensors 110-2 can detect abnormal behavior of the pet in the pet house 102 by collecting the anatomical movement attributes of the pet in the housing and converting the anatomical movement attributes into an electrical form as the second set of signals. The second set of signals can be transmitted to the processing unit 108. In an exemplary embodiment, the camera 110-2 can capture images of the pet, and the images can be transmitted to the processing unit 108 to detect the abnormal behavior of the pet from the anatomical movement attributes of the pet.
[0050] In an embodiment, the third set of sensors 110-3 can be coupled to the housing for detecting one or more environmental parameters in the housing and correspondingly generating a third set of signals. The one or more environmental parameters in the housing can be temperature, humidity, oxygen level, air quality, or a combination thereof. In an illustrative embodiment, the third set of sensors 110-3 can include a temperature sensor, a humidity sensor, an oximeter, an air quality sensor, or a combination thereof. In another illustrative embodiment, the third set of sensors 110-3 can detect the environment in the pet house 102 and convert the one or more environmental parameters into an electrical form as the third set of signals. The third set of signals can be transmitted to the processing unit 108. In an exemplary embodiment, the third set of sensors 110-3 can detect a fire inside the pet house 102, or detect that the temperature has reached beyond a predefined limit inside the pet house 102, and accordingly can transmit the data to the processing unit 108 in the form of the third set of signals.
[0051] In an embodiment, at least one actuator 106 can be configured with the at least one door 104 to enable opening and closing of the at least one door 104, and the at least one actuator 106 can comprise a pneumatic actuator, an electromagnetic actuator, or a combination thereof. In another embodiment, the at least one door 104 can be movably coupled to the housing at an opening in the housing using one or more of hinges, a shaft, a fitting, a joint, or a combination thereof, enabling the opening and closing of the at least one door 104.
[0052] In an embodiment, the processing unit 108 can be configured to receive the first set of signals, the second set of signals, and the third set of signals from the first set of sensors 110-1, the second set of sensors 110-2, and the third set of sensors 110-3, respectively, in the form of electrical signals. In an illustrative embodiment, the processing unit 108 can be a microprocessor, a microcontroller, an Arduino Uno, an ATmega328, or another similar processing unit. In yet another illustrative embodiment, the processing unit 108 can be configured to convert the received first set of signals, second set of signals, and third set of signals from electrical form to machine-readable or binary form with the help of sub-processing units such as an extraction unit, a comparison unit, an updating unit, a signal generation unit, and other unit(s).
[0053] In an embodiment, the processing unit 108 can be configured to extract a fourth set of signals from the first set of signals, where the fourth set of signals can pertain to the biometric attributes of the pet, and the biometric attributes can be the nose print. The processing unit 108 can be configured to compare the nose print with a dataset, where the dataset can include a plurality of patterns of nose prints of the one or more pets of the user. The processing unit 108 can be configured to generate the warning signal when the nose print matches at least one of the nose print patterns in the said dataset.
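The nose-print comparison above can be sketched in simplified form. The disclosure does not specify a matching algorithm, so this sketch assumes each print is reduced to a fixed-length feature vector and compared by a toy similarity score; the function names and the 0.9 threshold are assumptions for illustration only.

```python
# Hypothetical sketch of the nose-print comparison step. A real
# implementation would use image feature matching; here each nose print
# is a fixed-length feature vector and similarity is the fraction of
# matching entries (a toy metric, not a real biometric).
def matches_known_pet(nose_print, stored_prints, threshold=0.9):
    """Return True if the captured print matches any stored pattern."""
    def similarity(a, b):
        return sum(1 for x, y in zip(a, b) if x == y) / max(len(a), len(b))
    return any(similarity(nose_print, p) >= threshold for p in stored_prints)

# Example: the captured print equals the second stored pattern.
stored = [[1, 0, 1, 1, 0], [0, 1, 1, 0, 1]]
print(matches_known_pet([0, 1, 1, 0, 1], stored))  # True
print(matches_known_pet([1, 1, 0, 0, 0], stored))  # False
```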
[0054] In an embodiment, the processing unit 108 can be configured to extract a fifth set of signals from the second set of signals, where the fifth set of signals can pertain to the anatomical movement attributes of the pet in the housing. The processing unit 108 can be configured to compare the anatomical movement attributes of the pet with the dataset, where the dataset can include anatomical movement attributes such as pacing, shaking, vomiting, barking, howling, digging, or a combination thereof. The processing unit 108 can be configured to generate the warning signal when at least one of the anatomical movement attributes of the pet matches at least one of the anatomical movement attributes in the said dataset. In an exemplary embodiment, when the pet performs any unusual behavior such as pacing, shaking, vomiting, barking, howling, digging, or harming itself, which indicates that the pet is not feeling well, the user can be notified by transmitting the warning signal to the user.
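The abnormal-behavior check above amounts to intersecting the observed movements with the predefined set of abnormal movements named in the disclosure. The following hypothetical sketch assumes the extraction unit has already labelled the observed movements as strings:

```python
# Hypothetical sketch of the abnormal-behavior check: any observed
# movement label that appears in the predefined abnormal-movement
# dataset contributes to the warning signal.
ABNORMAL_MOVEMENTS = {"pacing", "shaking", "vomiting",
                      "barking", "howling", "digging"}

def movement_warning(observed_movements):
    """Return the set of abnormal movements detected (empty if none)."""
    return set(observed_movements) & ABNORMAL_MOVEMENTS

print(movement_warning(["sleeping", "vomiting"]))  # {'vomiting'}
print(movement_warning(["sleeping", "eating"]))    # set()
```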
[0055] In an embodiment, the processing unit 108 can be configured to extract a sixth set of signals from the third set of signals, where the sixth set of signals can pertain to the one or more environmental parameters in the pet house 102. The processing unit 108 can be configured to compare the one or more environmental parameters in the pet house 102 with the dataset, where the dataset can include one or more environmental parameters of the pet house 102 such as temperature, humidity, oxygen level, air quality, or a combination thereof. The processing unit 108 can be configured to generate the warning signal when at least one of the one or more environmental parameters of the pet house 102 is beyond the predefined limits of the one or more environmental parameters in the said dataset. In an exemplary embodiment, when the temperature of the pet house is ten degrees, indicating cooling inside the pet house 102, the user can be notified by transmitting the warning signal to the user.
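The environmental check above can be sketched as a range comparison per parameter. The limit values below are illustrative placeholders (only the 0-100 air quality index range appears later in the description); the function name and dictionary layout are assumptions.

```python
# Hypothetical sketch of the environmental-parameter check: each reading
# is compared against a predefined (low, high) limit range, and any
# out-of-range parameter contributes to the warning signal.
ENV_LIMITS = {
    "temperature_f": (69.0, 78.0),   # assumed comfortable band
    "humidity_pct":  (30.0, 60.0),   # assumed
    "oxygen_pct":    (19.5, 23.5),   # assumed
    "aqi":           (0.0, 100.0),   # air quality index range
}

def env_warnings(readings):
    """Return the parameters whose readings fall outside their limits."""
    out = {}
    for name, value in readings.items():
        low, high = ENV_LIMITS[name]
        if not (low <= value <= high):
            out[name] = value
    return out

# Example: ten degrees Fahrenheit inside the pet house triggers a warning.
print(env_warnings({"temperature_f": 10.0, "aqi": 42.0}))
```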
[0056] In an illustrative embodiment, the warning signal generated by the processing unit 108 can be transmitted to the one or more mobile computing devices 116-1 and the server 116-2. The one or more mobile computing devices 116-1 and the server 116-2 can be communicatively coupled with the processing unit 108 through a communication unit 118, such that the communication unit 118 is operatively coupled with the processing unit 108. In an exemplary embodiment, the one or more mobile computing devices can comprise any or a combination of a cell phone, a laptop, a computer, a palmtop, an iPad, and a tablet, and the server can include one or more processors along with the processing unit 108.
[0057] In an illustrative embodiment, the server 116-2 can be a cloud server for storing the first set of signals, the second set of signals, and the third set of signals generated by the first set of sensors, the second set of sensors, and the third set of sensors, respectively, where the first set of signals, the second set of signals, and the third set of signals can pertain to the biometric attributes of the pet, the anatomical movement attributes of the pet, and the one or more environmental parameters of the housing. In an exemplary embodiment, the data stored on the server 116-2 can be transmitted to the training and testing dataset configured with the processing unit 108, where the processing unit can learn the biometric attributes of the pet, the unusual behavior of the pet, and the intolerable environment inside the housing. In another exemplary embodiment, the data stored in the server 116-2 can be removed after approximately 30-40 days.
[0058] In an embodiment, upon receiving the warning signal at the one or more mobile computing devices 116-1, the user associated with the one or more mobile computing devices 116-1 can perform one or more operations accordingly. The one or more operations can be opening the at least one door fully, halfway, or just a bit, and closing the at least one door, according to the warning signal received from the processing unit 108.
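The door operations above can be sketched as a mapping from the user's command to an actuator setpoint. The command names and the percent-open values are assumptions for illustration; the disclosure only states that the door can be opened fully, halfway, a bit, or closed.

```python
# Hypothetical sketch: translate the user's door command into a
# percent-open setpoint for the actuator 106 (0 = closed, 100 = fully open).
DOOR_POSITIONS = {"close": 0, "a bit": 25, "half": 50, "fully": 100}

def actuator_setpoint(command):
    """Return the percent-open setpoint for a recognized door command."""
    if command not in DOOR_POSITIONS:
        raise ValueError(f"unknown door command: {command!r}")
    return DOOR_POSITIONS[command]

print(actuator_setpoint("half"))   # 50
print(actuator_setpoint("fully"))  # 100
```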
[0059] In an exemplary embodiment, the at least one door 104 can be a hinged door, a sliding door, a rotating door, a panel door, a window, and the like. In an exemplary embodiment, the housing can include one or more openings (e.g. windows) that allow for ventilation and/or light, and the windows can be covered with panels, shades, curtains, blinds, and the like.
[0060] FIG. 2 illustrates exemplary functional components of the proposed system, in accordance with an embodiment of the present disclosure.
[0061] As illustrated, in an embodiment, the system 100 can include sensor(s) 110; the sensor(s) 110 can comprise the first set of sensors 110-1 for collecting the biometric attributes of the pet, the second set of sensors 110-2 for collecting the anatomical movement attributes, and the third set of sensors 110-3 for collecting the one or more environmental parameters. In another embodiment, the system 100 can include actuator(s) 106; the said actuator(s) can be operatively coupled to the at least one door to enable opening and closing of the at least one door, and the at least one actuator can include a pneumatic actuator, an electromagnetic actuator, or a combination thereof. In yet another embodiment, the system 100 can include a communication unit 118; the said communication unit can be operatively coupled with the processing unit 108 and can be configured to communicatively couple the one or more mobile computing devices 116-1 and the server 116-2 with the processing unit 108.
[0062] As illustrated, in an embodiment, the processing unit 108 can include one or more processor(s) 202. The one or more processor(s) 202 can be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 202 are configured to fetch and execute computer-readable instructions stored in a memory 204 of the processing unit 108. The memory 204 can store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 204 can include any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, and the like.
[0063] In an embodiment, the processing unit 108 can also include an interface(s) 206. The interface(s) 206 can include a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 206 can facilitate communication of the processing unit 108 with various devices coupled to the processing unit 108. The interface(s) 206 can also provide a communication pathway for one or more components of the processing unit 108. Examples of such components include, but are not limited to, a processing engine(s) 208 and a database 210.
[0064] In an embodiment, the processing engine(s) 208 can be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 208. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 208 can be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the processing engine(s) 208 can include a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium can store instructions that, when executed by the processing resource, implement the processing engine(s) 208. In such examples, the processing unit 108 can include the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the processing unit 108 and the processing resource. In other examples, the processing engine(s) 208 can be implemented by electronic circuitry. The database 210 can include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 208.
[0065] In an embodiment, the processing engine(s) 208 can include an extraction unit 212, a comparison unit 214, an updating unit 216, a signal generation unit 218, and other unit(s) 220. The other unit(s) 220 can implement functionalities that supplement applications or functions performed by the system 100 or the processing engine(s) 208.
[0066] It would be appreciated that the units being described are only exemplary units, and any other unit or sub-unit may be included as part of the system 100. These units too can be merged or divided into super-units or sub-units as may be configured.
[0067] In an embodiment, the processing unit 108 can be configured to receive the first set of signals, the second set of signals, and the third set of signals from the first set of sensors 110-1, the second set of sensors 110-2, and the third set of sensors 110-3, respectively, in the form of electrical signals. The extraction unit 212 can be configured to extract the fourth set of signals from the first set of signals, the fifth set of signals from the second set of signals, and the sixth set of signals from the third set of signals, respectively, in machine-readable or binary form. In an embodiment, the fourth set of signals can pertain to the biometric attributes of the pet, the fifth set of signals can pertain to the anatomical movement attributes of the pet in the housing, and the sixth set of signals can pertain to the one or more environmental parameters in the pet house 102. In an exemplary embodiment, the biometric attributes can include the nose print of the pet, the anatomical movement attributes of the pet can be pacing, shaking, vomiting, barking, howling, digging, or a combination thereof, and the one or more environmental parameters can be temperature, humidity, oxygen level, air quality, or a combination thereof in the housing.
[0068] In an embodiment, the processing unit 108 can be configured to compare the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with a dataset that includes predefined limit ranges, with the help of the comparison unit 214, and to generate a warning signal, with the help of the signal generation unit 218, when at least one of the extracted biometric attributes, the anatomical movement attributes, or the one or more environmental parameters is beyond the predefined limit ranges.
[0069] In an embodiment, the comparison unit 214 can facilitate comparing the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with a dataset stored in the database 210. In another embodiment, the comparison unit 214 can compare the extracted biometric attributes with pre-stored nose prints in the database 210. In yet another embodiment, the comparison unit 214 can compare the anatomical movement attributes and the one or more environmental parameters against the dataset, and can facilitate finding whether the extracted anatomical movement attributes and the one or more environmental parameters have reached the predefined limit ranges.
[0070] In an embodiment, the predefined limit ranges can include threshold values pertaining to the anatomical movement attributes, where the anatomical movement attributes can be pacing, shaking, vomiting, barking, howling, digging, and combinations thereof. In an exemplary embodiment, the threshold values can include, but are not limited to, limit ranges for the time duration of a dog's bark, the pitch of the bark, and the frequency of barking. In another embodiment, the predefined limit ranges can include threshold values pertaining to the one or more environmental parameters, where the one or more environmental parameters can be temperature, humidity, oxygen level, air quality, and combinations thereof. In an exemplary embodiment, the threshold values can include, but are not limited to, a limit range of seventy-five to seventy-eight degrees Fahrenheit for the temperature parameter in summer and sixty-nine to seventy-two degrees Fahrenheit for the temperature parameter in winter, and a limit range of 0 to 100 for the air quality index; similarly, the threshold values can include different limits for other parameters.
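The limit ranges exemplified above can be captured as a simple lookup. The following Python sketch is purely illustrative: the range values mirror the examples in this paragraph, while the data structure, function, and parameter names are assumptions and not part of the disclosed system.

```python
# Illustrative predefined limit ranges keyed by (parameter, season); the
# values mirror the examples in the paragraph above, while the structure
# and names are assumptions, not part of the disclosed system.
PREDEFINED_LIMITS = {
    ("temperature_f", "summer"): (75.0, 78.0),
    ("temperature_f", "winter"): (69.0, 72.0),
    ("air_quality_index", None): (0.0, 100.0),
}

def within_limits(parameter, value, season=None):
    """Return True when a reading lies inside its predefined limit range."""
    low, high = PREDEFINED_LIMITS[(parameter, season)]
    return low <= value <= high
```

A summer reading of 80 degrees Fahrenheit or an air quality index of 150 would fall outside these ranges and thus qualify for a warning signal.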
[0071] In an embodiment, the comparison unit 214 can facilitate comparing the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with the help of a comparator. The comparator can include an analogue comparator or a digital comparator. The digital comparator can compare the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters with the predefined limit ranges. The digital comparator can facilitate comparison with the help of logic gates such as AND, NOT, or NOR gates, and can be configured to accept the extracted biometric attributes, the anatomical movement attributes, and the one or more environmental parameters in machine-readable form. Further, three conditions can be applicable for the comparison of the extracted anatomical movement attributes and the one or more environmental parameters with the predefined limit ranges.
[0072] In an illustrative embodiment, the three conditions associated with the digital comparator can include a first condition, which prevails when the extracted anatomical movement attributes and the one or more environmental parameters are found equal to the predefined limit ranges; a second condition, which prevails when they are found beyond the predefined limit ranges; and a third condition, which prevails when they are found less than the predefined limit ranges. The digital comparator can compare and transmit the compared extracted anatomical movement attributes and the one or more environmental parameters to the signal generation unit 218.
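The three comparator conditions can be encoded in software as a three-way classification against a limit, as in this hypothetical sketch (the function and constant names are assumptions, not the disclosed implementation):

```python
# Hypothetical encoding of the three comparator conditions described above,
# comparing a reading against the upper value of its predefined limit.
EQUAL, BEYOND, LESS_THAN = 0, 1, -1

def digital_compare(value, limit):
    """Classify a reading as equal to, beyond, or less than a limit."""
    if value > limit:
        return BEYOND      # second condition: beyond the predefined limit
    if value == limit:
        return EQUAL       # first condition: equal to the predefined limit
    return LESS_THAN       # third condition: less than the predefined limit
```

Only the BEYOND outcome would cause the signal generation unit to emit a warning for the anatomical movement and environmental parameters.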
[0073] In an illustrative embodiment, only the first condition can be applicable for the comparison of the extracted biometric attributes, which prevails when the extracted biometric attributes are found equal to a pre-stored nose print.
[0074] In an embodiment, the processing unit 108 can be configured with machine learning modules and machine learning techniques to update a training and testing dataset based on the received anatomical movement attributes and the one or more environmental parameters, with the help of the updating unit 216. The training and testing dataset can be communicated to the mobile computing device 116-1 and the server 116-2 with the help of the communication unit 118, where the communication unit 118 can include any or a combination of Wireless Fidelity (Wi-Fi), Bluetooth, Li-Fi, optical fiber, Wireless Local Area Network (WLAN), ZigBee, and the like.
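The updating step above can be sketched as appending newly received attribute samples and re-splitting the data into training and testing portions. This is a minimal illustration only: the function name, sample format, and the shuffle-and-split strategy are assumptions, as the disclosure does not specify how the dataset is maintained.

```python
import random

def update_training_testing_dataset(existing, new_samples,
                                    test_fraction=0.2, seed=0):
    """Append newly received attribute samples to the dataset and re-split
    it into training and testing portions (hypothetical sketch; names and
    the split strategy are assumptions, not part of the disclosure)."""
    combined = list(existing) + list(new_samples)
    rng = random.Random(seed)       # fixed seed for a reproducible split
    rng.shuffle(combined)
    cut = int(len(combined) * (1 - test_fraction))
    return combined[:cut], combined[cut:]
```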
[0075] In an embodiment, computational intelligence can be applied to the processing unit 108 to automate the process of opening and closing the at least one door 104. In another embodiment, the machine learning techniques can include any or a combination of Artificial Neural Networks (ANN), evolutionary computation, swarm intelligence, artificial immune systems, and fuzzy systems. The machine learning modules and techniques can be configured to create a training and testing dataset based on the second set of signals and the third set of signals received from the second set of sensors 110-2 and the third set of sensors 110-3.
[0076] In an embodiment, the signal generation unit 218 can be configured to receive the compared biometric attributes, the anatomical movement attributes, and the one or more environmental parameters in machine-readable form. In an embodiment, the signal generation unit 218 can be configured to generate a warning signal when at least one of the received biometric attributes is found in the dataset, or when at least one of the anatomical movement attributes or the one or more environmental parameters is found beyond the predefined limit ranges.
[0077] In an exemplary embodiment, when at least one of the received biometric attributes, i.e., the nose print of the pet, is similar to a nose print already stored in the dataset, the warning signal can be generated. In another exemplary embodiment, when at least one of the anatomical movement attributes is detected, such as the time duration of a dog's bark being longer than usual, the pitch of the bark being high, or any combination thereof, the signal generation unit 218 can be configured to generate the warning signal, and the generated warning signal can be transmitted to the one or more mobile computing devices 116-1.
[0078] In an exemplary embodiment, when at least one of the one or more environmental parameters, such as the temperature parameter, is found above seventy-eight degrees Fahrenheit in summer by the comparison unit 214, the signal generation unit 218 can be configured to generate the warning signal, and the generated warning signal can be transmitted to the one or more mobile computing devices 116-1. In another exemplary embodiment, when the received air quality index value is found above one hundred by the comparison unit 214, the signal generation unit 218 can be configured to generate the warning signal, and the generated warning signal can be transmitted to the one or more mobile computing devices 116-1.
[0079] In an embodiment, the processing unit 108 can transmit the warning signals automatically to the at least one actuator 106, which enables opening and closing of the at least one door. In another embodiment, the processing unit 108 can be configured with the training and testing dataset, which can include the anatomical movement attributes and the one or more environmental parameters of a plurality of pets. The anatomical movement attributes and the one or more environmental parameters extracted from the fifth set of signals and the sixth set of signals can be stored in the training and testing dataset, which assists the training and testing dataset in learning more. In an exemplary embodiment, the processing unit can transmit the warning signal automatically to the at least one actuator 106 when at least one of the anatomical movement attributes or the one or more environmental parameters is found beyond the predefined limit ranges.
[0080] In an embodiment, the processing unit 108 can generate an alert signal pertaining to a predefined feeding time of the pet, and the alert signal can be transmitted to the one or more mobile computing devices 116-1 after the predefined feeding time. In an exemplary embodiment, the feeding times can be 7 AM and 7 PM, and the feeding times can be changed by the user through the one or more mobile computing devices 116-1 according to the health of the pet. In another exemplary embodiment, upon receiving the alert signal, the user can feed the pet or can ask any other person to feed the pet if the user is not near the pet. In yet another exemplary embodiment, the user can open the at least one door 104 through the one or more mobile computing devices 116-1, and the pet can come out of the pet house 102 to have food.
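The feeding-time alert logic above can be sketched as a schedule check. The 7 AM and 7 PM times come from the example in this paragraph; the function name and the comparison against a "last fed" time are illustrative assumptions.

```python
from datetime import time

# Hypothetical feeding schedule: 7 AM and 7 PM, as in the example above.
FEEDING_TIMES = [time(7, 0), time(19, 0)]

def feeding_alert_due(now, last_fed, feeding_times=FEEDING_TIMES):
    """Return True when a scheduled feeding time has passed since the pet
    was last fed, so an alert should be sent (illustrative sketch)."""
    return any(last_fed < t <= now for t in feeding_times)
```

If the pet was last fed at 6 AM and it is now 8 AM, the 7 AM slot has elapsed, so an alert would be transmitted to the mobile computing device.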
[0081] In an embodiment, a unique identification number is assigned to the pet house 102 that facilitates the user controlling a plurality of pet houses 102 through the one or more mobile computing devices 116-1. In an exemplary embodiment, the plurality of pet houses 102 can be controlled through at least one of the one or more mobile computing devices 116-1, such that the user can have more than one pet and each pet can stay in a separate pet house. In another exemplary embodiment, the unique identification number can be a combination of numbers, alphabets, and special characters, and the user can open or close the at least one door 104 of any specific pet house upon receiving the warning signal and/or the alert signal.
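Addressing several pet houses by unique identification number can be modelled as a small registry, as in the following hypothetical sketch. The class name, identifier pattern, and door states are assumptions introduced for illustration only.

```python
import re

class PetHouseRegistry:
    """Hypothetical registry mapping unique identification numbers to pet
    houses so one mobile device can control several houses (a sketch; the
    class, pattern, and door states are assumptions, not the disclosure)."""

    # A combination of numbers, alphabets, and a few special characters.
    ID_PATTERN = re.compile(r"^[A-Za-z0-9#@$-]+$")

    def __init__(self):
        self.doors = {}  # unique identification number -> door state

    def register(self, house_id):
        if not self.ID_PATTERN.match(house_id):
            raise ValueError("invalid unique identification number")
        self.doors[house_id] = "closed"

    def set_door(self, house_id, state):
        """Open or close the door of one specific pet house by its number."""
        self.doors[house_id] = state
```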
[0082] In an embodiment, the processing unit 108 can include a set of voice recordings of the user in the dataset. The user can play at least one of the voice recordings upon receiving the warning signal from the processing unit 108. In another embodiment, the processing unit 108 can transmit the set of voice recordings to the training and testing dataset, where the machine learning techniques and modules can be applied to the voice recordings to generate sounds in the user's voice. The machine learning techniques can include any or a combination of Artificial Neural Networks (ANN), evolutionary computation, swarm intelligence, artificial immune systems, fuzzy systems, and the like. In an exemplary embodiment, when the user is not present near the pet, the processing unit 108 can generate sounds in the user's voice automatically, to reassure the pet that the user is near the pet house 102.
[0083] FIG. 3 illustrates an exemplary view of the opening and closing of the at least one door of the pet house, in accordance with an embodiment of the present disclosure.
[0084] In an exemplary embodiment, when the at least one door is opened and the pet is not detected in the pet house 102 by the sensor(s) 110, the processing unit 108 can transmit the warning signals automatically to the at least one actuator 106, configured to the at least one door, for closing the at least one door. In another exemplary embodiment, the processing unit 108 can transmit the warning signals to the cell phone 116-1 associated with the user, and the user can actuate the at least one actuator 106, configured to the at least one door, for closing the at least one door.
[0085] In an exemplary embodiment, when the pet is detected near the at least one door 104, outside the pet house 102, the first set of sensors 110-1 can detect the nose print of the pet, and the detected nose print can be compared with the pre-stored nose prints, through the processing unit 108, and when the detected nose print is found in the dataset, the at least one door 104 can be opened automatically. In another exemplary embodiment, when the pet is detected near the at least one door 104, outside the pet house 102, the first set of sensors 110-1 can detect the nose print of the pet, and the detected nose print can be compared with the pre-stored nose prints, through the processing unit 108, and the user can be informed to open the at least one door 104 by transmitting the warning signal to the cell phone 116-1 associated with the user.
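The two door-control decisions described in the preceding paragraphs can be sketched as simple predicates: open automatically on a matching nose print (otherwise notify the user), and close automatically when the door is open but no pet is detected inside. The function names and return values are illustrative assumptions, not the disclosed implementation.

```python
# Hypothetical sketch of the door-control decisions described above.
def decide_door_action(detected_nose_print, prestored_nose_prints):
    """Open the door on a nose-print match, else notify the user's phone."""
    if detected_nose_print in prestored_nose_prints:
        return "open_door"    # match found in the dataset: open automatically
    return "notify_user"      # no match: warn the user's cell phone instead

def door_should_close(door_open, pet_detected_inside):
    """Close an open door when the sensors detect no pet in the house."""
    return door_open and not pet_detected_inside
```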
[0086] In an exemplary embodiment, the pet house can be easily placed in a garden area or inside the house. The pet house can be coupled with the mobile computing devices 116-1 associated with the persons associated with the house, and the opening and closing of the at least one door 104 can be executed automatically by the processing unit 108 or can be executed by transmitting signals from the mobile computing devices 116-1.
[0087] In an embodiment, one or more switches 120 can be coupled to the housing, and the one or more switches 120 can facilitate turning the system on and off. In an exemplary embodiment, turning on and off can be controlled by the same switch or by separate switches.
[0088] In an embodiment, the system can include a power source configured to supply electric power to the sensor(s) 110, the processing unit 108, and the actuator(s) 106. In another embodiment, the power source can include any or a combination of rechargeable battery, rechargeable cells, solar cells, solar battery, electrochemical cells, storage battery, and secondary cell.
[0089] Thus, it will be appreciated by those of ordinary skill in the art that the diagrams, schematics, illustrations, and the like represent conceptual views or processes illustrating systems and methods embodying this invention. The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing associated software. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the entity implementing this invention. Those of ordinary skill in the art further understand that the exemplary hardware, software, processes, methods, and/or operating systems described herein are for illustrative purposes and, thus, are not intended to be limited to any particular name.
[0090] As used herein, and unless the context dictates otherwise, the term "coupled to" is intended to include both direct coupling (in which two elements that are coupled to each other contact each other) and indirect coupling (in which at least one additional element is located between the two elements). Therefore, the terms "coupled to" and "coupled with" are used synonymously. Within the context of this document, the terms "coupled to" and "coupled with" are also used to mean "communicatively coupled with" over a network, where two or more devices are able to exchange data with each other over the network, possibly via one or more intermediary devices.
[0091] It should be apparent to those skilled in the art that many more modifications besides those already described are possible without departing from the inventive concepts herein. The inventive subject matter, therefore, is not to be restricted except in the spirit of the appended claims. Moreover, in interpreting both the specification and the claims, all terms should be interpreted in the broadest possible manner consistent with the context. In particular, the terms "comprises" and "comprising" should be interpreted as referring to elements, components, or steps in a non-exclusive manner, indicating that the referenced elements, components, or steps may be present, or utilized, or combined with other elements, components, or steps that are not expressly referenced.
[0092] While the foregoing describes various embodiments of the invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. The scope of the invention is determined by the claims that follow. The invention is not limited to the described embodiments, versions or examples, which are included to enable a person having ordinary skill in the art to make and use the invention when combined with information and knowledge available to the person having ordinary skill in the art.
ADVANTAGES OF THE PRESENT DISCLOSURE
[0093] The present disclosure provides a system for accommodating and monitoring pets.
[0094] The present disclosure provides a system that enables automatic opening and closing of the door of the pet house.
[0095] The present disclosure provides a system that facilitates monitoring behavior of the pets in the pet house remotely.
[0096] The present disclosure provides a system that facilitates monitoring of environment inside the pet house remotely.
[0097] The present disclosure provides a system that facilitates the safety of the pet by checking the presence of the pet in the pet house.
[0098] The present disclosure provides a system that alerts the owner of the pets to feed the pets.
[0099] The present disclosure provides a system that alerts the owner in case of suffocation in the pet house.
[00100] The present disclosure provides a system for accommodating and monitoring the pets, which is highly scalable, affordable, and cost effective.
Claims:
1. A system for accommodating and monitoring pets, the system comprising:
a pet house comprising:
a housing comprising at least one door, and configured to accommodate the pet;
at least one actuator configured with the door to enable opening and closing of the at least one door;
a first set of sensors configured with the housing for capturing biometric attributes of the pet, and correspondingly generate a first set of signals;
a second set of sensors configured with the housing for detecting anatomical movement attributes of the pet in the housing and correspondingly generate a second set of signals; and
a third set of sensors configured with the housing for detecting one or more environmental parameters in the housing and correspondingly generate a third set of signals; and
a processing unit communicatively coupled with the first set of sensors, the second set of sensors, the third set of sensors, and the at least one actuator, wherein the processing unit comprises one or more processors coupled with a memory, the memory storing instructions executable by the one or more processors and configured to:
receive the first set of signals, the second set of signals, and the third set of signals;
extract the biometric attributes of the pet, the anatomical movement attributes of the pet, and the one or more environmental parameters of the housing, and update a training and testing dataset configured with the processing unit;
compare the biometric attributes, the anatomical movement attributes and the one or more environmental parameters with a dataset comprising predefined limits associated with the biometric attributes, the anatomical movement attributes and the one or more environmental parameters;
generate a warning signal, in case at least one of the extracted biometric attributes, the anatomical movement attributes and the one or more environmental parameters are beyond the respective predefined limits.
2. The system as claimed in claim 1, wherein the first set of sensors comprises camera, scanner, nose print sensor, capacitive sensor, and combination thereof; and wherein the biometric attributes comprises a nose print.
3. The system as claimed in claim 1, wherein the second set of sensors comprises camera, proximity sensor, motion sensor, accelerometer, infrared sensor, acoustic sensor, and combination thereof; and
wherein the anatomical movement attributes of pet comprises pacing, shaking, vomiting, barking, howling, digging, and combination thereof.
4. The system as claimed in claim 1, wherein the third set of sensors comprises temperature sensor, humidity sensor, oximeter, air quality sensor, and combination thereof; and
wherein the one or more environmental parameters comprises temperature, humidity, oxygen level, air quality and combination thereof.
5. The system as claimed in claim 1, wherein the at least one door is movably coupled to the housing at an opening in the housing using one or more hinges, shaft, fitting, joint, and combination thereof that enables the opening and closing of the at least one door.
6. The system as claimed in claim 5, wherein the at least one actuator comprises pneumatic actuator, electromagnetic actuator, and a combination thereof, operatively coupled to the at least one door.
7. The system as claimed in claim 1, wherein the processing unit is configured to generate an alert signal pertaining to a predefined feeding time of the pet, and transmit the alert signal to the one or more mobile computing devices after the predefined feeding time.
8. The system as claimed in claim 1, wherein a unique identification number is assigned to the pet house that facilitates a user associated with the one or more mobile computing devices to control plurality of pet houses by the one or more mobile computing devices.
9. The system as claimed in claim 6, wherein the one or more mobile computing devices are communicatively coupled with the processing unit, and configured to receive the warning signal and the alert signal from the processing unit, and wherein the one or more mobile computing devices comprise any or a combination of cell phone, laptop, computer, palmtop, iPad, and tablet.
10. The system as claimed in claim 1, wherein the system comprises a communication unit operatively coupled with the processing unit, configured to communicatively couple the one or more mobile computing devices with the processing unit, and wherein the communication unit comprises any or a combination of Wireless Fidelity (Wi-Fi) Module, Bluetooth Module, Li-Fi Module, optical fiber, Wireless Local Area Network (WLAN), and ZigBee module.