Abstract: A system (100) for real-time monitoring of health of livestock on a farm is presented. The system (100) includes an acquisition subsystem (114) configured to obtain optical images (106), infrared images (110), sensor information, and animal information related to animals (102) on the farm, and ambient information. Further, the system (100) includes a processing subsystem (116) including a monitoring platform (118) configured to retrieve one or more AI models (120), use an AI based data fusion approach to integrate data from the optical images (106) and the infrared images (110), sensor information, animal information, and ambient information via an AI model (120) to generate outcomes (238, 306) that correspond to a health condition, a heat/estrus condition, or a stress condition of the animal (102). The system (100) also includes an interface unit (122, 124) configured to provide, in real-time, the outcomes (238, 306) to facilitate analysis or treatment.
DESC:BACKGROUND
[0001] Embodiments of the present specification relate generally to wellness of livestock, and more particularly to systems and methods for remotely monitoring the condition of livestock to ascertain or predict their health and fertility condition.
[0002] Wellness of the livestock such as a buffalo is typically defined as a state or quality of being in good health and an absence of illness. Also, stress is defined as a state of the body of the buffalo in which a steady state is disturbed as a result of various external and internal stressors, resulting in emotional and/or physical tension. Further, heat or estrus in a female of most livestock is a state of sexual receptiveness during which the female animal is ready to accept a male to mate.
[0003] The ability to efficiently monitor the health of the livestock is of interest to the farming industry to ensure optimal yield, quality, and productivity of animals such as buffaloes on dairy farms or buffalo farms. In addition, timely heat/estrus detection in the livestock is critical to the determination of an optimal time for insemination. Failure to accurately detect estrus in the livestock disadvantageously leads to diminished conception rates. Any lapse in the efficient monitoring of the health of the livestock and inefficient and/or ill-timed estrus detection in the livestock lead to significant losses and hence may adversely impact the dairy farm.
[0004] It may be noted that buffaloes on the farm tend to be less mobile as opposed to other livestock such as free moving cows. Also, the buffaloes exhibit different behavior compared to other livestock such as cows while sick or in heat. Also, the buffaloes are known to be silent heat animals. As will be appreciated, silent heat is generally representative of a condition in which while the buffaloes exhibit physiological symptoms of estrus, they do not exhibit typical behavioral symptoms of estrus.
[0005] Generally, farm hands rely on traditional observation techniques to ensure the wellbeing of the animals on the farm. Also, traditional detection of estrus in the livestock such as buffaloes entails visual inspection of the buffaloes by the farm hands. However, the growing number of the animals on the farms makes it impractical for the farm hands to effectively monitor the health of the animals using the traditional observation techniques, hence necessitating additional monitoring.
[0006] Presently, some techniques for monitoring the health and detecting estrus in the livestock entail use of tags, sensors, and transponder units for collecting data from the animals. However, these systems call for invasively attaching the tags and sensors to the animals, which may cause discomfort to the animals. Also, some non-invasively attached sensors may become detached and hence fail to provide data regarding the animals. Further, complex systems are needed to analyze the data collected from these sensors and tags.
[0007] Several other techniques have been developed to aid in the detection of heat in livestock. For example, biochemical means and other devices have been employed for heat detection. However, the efficiency of heat detection via use of these techniques has been found to be lacking, especially in monitoring estrus in buffaloes. By way of example, methods used to monitor the health and detect heat in cows fail to accurately work for buffaloes due to the difference in manifestation of estrus and/or sickness in buffaloes.
[0008] As will be appreciated, skin temperature is a valuable indicator of the health of livestock. Infrared (IR) technology has been used as a non-contact method to measure skin temperature in livestock. For example, while IR cameras have been used to monitor a buffalo’s skin temperature to observe its health and estrus, these IR cameras fail to accurately measure the skin temperature of the buffalo due to the thick skin of the buffalo. Other techniques for monitoring the health of animals such as the buffalo and estrus detection entail use of a video camera for recording the animals. However, viewing the enormous amount of recorded data to identify a sick buffalo or a buffalo in heat is a laborious task. Also, the view angle of the video camera may not be able to cover all the animals in the herd.
BRIEF DESCRIPTION
[0009] In accordance with one aspect of the present specification, a system for real-time monitoring of health of livestock on a farm is presented. The system includes an acquisition subsystem configured to obtain a plurality of optical images corresponding to each animal on the farm, a plurality of infrared images corresponding to each animal on the farm, sensor information related to each animal on the farm, animal information related to each animal on the farm, ambient information related to the farm, or combinations thereof. Further, the system includes a processing subsystem in operative association with the acquisition subsystem and including a monitoring platform, where to monitor in real-time the health of an animal on the farm, the monitoring platform is configured to retrieve one or more artificial intelligence models, use an artificial intelligence based data fusion approach to integrate data from the plurality of optical images corresponding to the animal and the plurality of infrared images corresponding to the animal, the sensor information related to the animal on the farm, the animal information related to the animal on the farm, the ambient information related to the farm, or combinations thereof via an artificial intelligence model to generate one or more outcomes, where the one or more outcomes correspond to a health condition of the animal, a heat/estrus condition of the animal, a stress condition of the animal, or combinations thereof. Moreover, the system includes an interface unit configured to provide, in real-time, the one or more outcomes to facilitate analysis or treatment.
[0010] In accordance with aspects of the present specification, a method for real-time monitoring of health of livestock on a farm is presented. The method includes receiving sensor information related to each animal on the farm, animal information related to each animal on the farm, ambient information related to the farm, or combinations thereof. Moreover, the method includes receiving a plurality of optical images corresponding to each animal on the farm, where the plurality of optical images is obtained via a plurality of optical cameras strategically positioned on the farm. Furthermore, the method includes receiving a plurality of infrared images corresponding to each animal, where the plurality of infrared images is obtained via a plurality of infrared cameras strategically positioned on the farm. Additionally, the method includes retrieving one or more artificial intelligence models. The method also includes using an artificial intelligence based data fusion approach to integrate data from the plurality of optical images corresponding to each animal and the plurality of infrared images corresponding to each animal, the sensor information related to each animal on the farm, the animal information related to each animal on the farm, the ambient information related to the farm, or combinations thereof via an artificial intelligence model to generate one or more outcomes, where the one or more outcomes correspond to a health condition of the animal, a heat/estrus condition of the animal, a stress condition of the animal, or combinations thereof. Further, the method includes providing the one or more outcomes to facilitate analysis.
[0011] In accordance with another aspect of the present specification, a processing system for real-time monitoring of health of livestock on a farm is presented. The processing system comprises a monitoring platform, where the monitoring platform is configured to retrieve one or more artificial intelligence models, use an artificial intelligence based data fusion approach to integrate data from the plurality of optical images corresponding to the animal and the plurality of infrared images corresponding to the animal, the sensor information related to the animal on the farm, the animal information related to the animal on the farm, the ambient information related to the farm, or combinations thereof via an artificial intelligence model to generate one or more outcomes where the one or more outcomes correspond to a health condition of the animal, a heat/estrus condition of the animal, a stress condition of the animal, or combinations thereof, and provide the one or more outcomes to facilitate analysis.
DRAWINGS
[0012] These and other features and aspects of embodiments of the present specification will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
[0013] FIG. 1 is a schematic representation of an exemplary system for monitoring health of livestock on a farm, in accordance with aspects of the present specification;
[0014] FIG. 2 is a flow chart illustrating a method for monitoring health of livestock on a farm, in accordance with aspects of the present specification;
[0015] FIG. 3 is a schematic illustration of one embodiment of a method for monitoring health of livestock on a farm of FIG. 2, in accordance with aspects of the present specification;
[0016] FIGs. 4(a)-4(c) are diagrammatical illustrations of providing the monitored health or estrus of livestock on a farm to facilitate analysis and/or treatment recommendations, in accordance with aspects of the present specification; and
[0017] FIG. 5 is a schematic representation of one embodiment of a digital processing system implementing a monitoring platform for use in the system of FIG. 1, in accordance with aspects of the present specification.
DETAILED DESCRIPTION
[0018] The following description presents exemplary systems and methods for monitoring health, stress, and estrus of livestock such as one or more buffaloes. Particularly, embodiments described hereinafter present exemplary systems and methods that facilitate enhanced non-invasive remote monitoring and prediction of the wellbeing of buffaloes and heat detection in buffaloes on a farm. Use of the present systems and methods presents significant advantages in reliably monitoring the health of buffaloes to detect any illness or stress in the buffaloes. Additionally, use of the present systems and methods facilitates the accurate prediction or detection of estrus in buffaloes, thereby overcoming the drawbacks of currently available traditional methods of detecting estrus.
[0019] For ease of understanding, the exemplary embodiments of the present systems and methods are described in the context of a health monitoring system configured to accurately monitor the health of a buffalo. However, use of the exemplary embodiments illustrated hereinafter in other systems and applications such as monitoring health and heat in other livestock is also contemplated. An exemplary environment that is suitable for practicing various implementations of the present systems and methods is discussed in the following sections with reference to FIG. 1.
[0020] Referring now to the drawings, FIG. 1 illustrates an exemplary system 100 for monitoring health of livestock on a farm such as one or more buffaloes 102. In particular, the system 100 is configured to accurately monitor and predict one or more health-related outcomes of the buffalo 102 based on data related to one or more optical images corresponding to the buffalo 102, one or more infrared (IR) images corresponding to the buffalo 102, sensor information related to the buffalo 102, animal information related to the buffalo 102, ambient information related to a farm, or combinations thereof. Specifically, an AI-based data fusion approach is employed to combine the data related to the one or more optical images and the one or more IR images of the buffalo 102, the sensor information, the animal information, and/or the ambient information to obtain a more accurate prediction of outcomes that are indicative of the health, stress, and heat of the buffalo 102, thereby circumventing shortcomings of the currently available techniques for monitoring the health, heat, and stress in the buffalo 102. The health-related outcomes may correspond to a health condition of the buffalo 102, a heat/estrus condition of the buffalo 102, a stress condition of the buffalo 102, and the like. In some embodiments, these predicted outcomes may be processed and used for recommendations for further analysis, follow-up, and/or treatment planning.
[0021] It may be noted that for ease of illustration, the exemplary system 100 is described with reference to monitoring of the health of one buffalo 102. However, the system 100 may be used to monitor the health of one or more animals on a farm. Also, for ease of explanation, the exemplary system 100 is described with reference to monitoring of the health of one buffalo 102 via use of one or more optical images and one or more IR images along with other information. However, use of videos, movies, and other types of audio and visual media by the system 100 to monitor the health of one or more buffaloes in a farm is also anticipated.
[0022] As used herein, the term “livestock” is used to refer to domesticated animals that are raised for food, labor, or money. For example, livestock includes primarily cattle, sheep, pigs, goats, horses, donkeys, mules, buffalo, oxen, llamas, camels, and the like. Also, as used herein, the term “segmentation” is used to refer to the detection or identification of one or more anatomical regions of interest in optical images or IR images of the buffalo 102. Further, as used herein, the terms “activity,” “event,” or “events” are used to refer to movements or actions of the buffalo 102. Some non-limiting examples of the buffalo activity include rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like.
[0023] In accordance with aspects of the present specification, information regarding each buffalo 102 on the farm may be provided to the system 100. By way of example, the information related to the buffalo 102 may include sensor information and animal information. Additionally, the information may also include ambient information and time stamp information. The sensor information may be acquired via one or more sensors 105 disposed on or about the buffalo 102. In one example, the sensor information related to the buffalo 102 acquired from the one or more sensors 105 may include pulse rate, heart rate variability, menstrual cycle, current location, and the like. Also, in certain embodiments, a virtual fence (not shown in FIG. 1) may also be used to ascertain a current location of the buffalo 102.
[0024] Further, the animal information may include information related to the buffalo 102 such as, but not limited to, an age, sex, identification (ID) number, height, weight, and other such parameters corresponding to each buffalo 102. In one embodiment, the animal information related to the buffalo 102 may be manually entered. Additionally, the ambient information may include ambient temperature and humidity recorded on the farm. Furthermore, time stamp information may include a time associated with the acquisition of each optical image and IR image, the sensor information, and/or the animal information. The information which includes the sensor information, animal information, ambient information, time stamp information, or combinations thereof may be stored locally in a local data repository such as data repository 126. Additionally or alternatively, the information regarding the buffalo 102 may be stored remotely. It may be noted that the information for each buffalo 102 is updated to the data repository 126 with a time stamp. Any time a buffalo 102 is moved or any information changes, the changes may be recorded and updated in the data repository 126.
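The record-keeping described above can be sketched as a minimal in-memory repository in which every write carries a fresh time stamp. This is an illustrative sketch only: the class names, field names, and `upsert` helper are assumptions for exposition, not part of the specification.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AnimalRecord:
    # Illustrative animal-information fields (ID, age, sex, height, weight).
    animal_id: str
    age_years: float
    sex: str
    height_cm: float
    weight_kg: float
    updated_at: str = ""  # filled in on every write

class DataRepository:
    """Minimal stand-in for the local data repository (126)."""
    def __init__(self):
        self._records = {}

    def upsert(self, record: AnimalRecord) -> None:
        # Each update is stored with a time stamp, as described above.
        record.updated_at = datetime.now(timezone.utc).isoformat()
        self._records[record.animal_id] = record

    def get(self, animal_id: str) -> AnimalRecord:
        return self._records[animal_id]

repo = DataRepository()
repo.upsert(AnimalRecord("BUF-001", age_years=4.5, sex="F",
                         height_cm=135.0, weight_kg=540.0))
```

A real deployment would back this with a persistent local database and mirror changes to remote storage, but the update-with-time-stamp behavior is the same.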
[0025] In a presently contemplated configuration, the system 100 includes one or more optical cameras 104 and one or more infrared (IR) cameras 108 strategically positioned on the farm. The optical cameras 104 are configured to obtain one or more optical images 106 of the buffalo 102. Similarly, the IR cameras 108 are configured to obtain one or more IR images 110 of the buffalo 102. Further, strategically positioning the optical cameras 104 and the IR cameras 108 entails locating the optical cameras 104 and the IR cameras 108 so as to provide optimal coverage of all the buffaloes on the farm. The number of optical cameras 104 and/or the IR cameras 108 employed may be determined so as to ensure full coverage of the farm and provide adequate access to the anatomies or anatomical regions of interest in the buffaloes on the farm. It may be noted that for ease of illustration, only one optical camera 104 and one IR camera 108 are depicted in FIG. 1.
[0026] Optical images 106 acquired during the day or in conditions with adequate lighting may be better suited for monitoring buffalo activity and/or gathering information related to the buffalo 102. Hence, in accordance with aspects of the present specification, during the daytime or in adequate lighting conditions, primarily the optical images 106 may be used to detect or track activity or events in the buffalo 102, which in turn may be used to determine the health, heat, and stress in the buffalo 102. Additionally, these optical images 106 may also be primarily used to aid in the segmentation of the IR images 110 to obtain more accurate skin temperature measurements. As used herein, the term segmentation is used to refer to the detection or identification of one or more regions of interest in the optical images 106 or the IR images 110 of the buffalo 102.
[0027] However, optical images 106 captured during the nighttime or with very poor or no lighting may not be suitable for tracking any buffalo activity or for segmenting regions of interest in the IR images 110. Therefore, in accordance with aspects of the present specification, in these poor or no light conditions, primarily the IR images 110 may be used to monitor buffalo activity and/or to gather information related to the buffalo 102. Also, anatomical regions of interest in the IR images 110 may be segmented primarily using the IR images 110 themselves. In accordance with aspects of the present specification, based on the lighting condition during the acquisition of the optical images 106 and the IR images 110, either the optical images 106 or the IR images 110 may be employed to facilitate monitoring the health, heat, and stress of the buffalo 102.
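The lighting-based choice between the two image streams described in the two preceding paragraphs can be expressed as a simple selection rule. The daytime hour boundaries and the lux threshold below are assumed values for illustration; an actual system may derive the lighting condition from the time stamp, a light sensor, or the images themselves.

```python
def select_primary_modality(hour_of_day: int, lux: float,
                            day_start: int = 6, day_end: int = 18,
                            min_lux: float = 50.0) -> str:
    """Pick which image stream primarily drives activity tracking and
    segmentation: optical images in adequate light, IR images otherwise."""
    adequate_light = day_start <= hour_of_day < day_end and lux >= min_lux
    return "optical" if adequate_light else "infrared"

print(select_primary_modality(hour_of_day=11, lux=400.0))  # optical
print(select_primary_modality(hour_of_day=23, lux=0.5))    # infrared
```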
[0028] With continuing reference to FIG. 1, the following paragraphs describe the method of monitoring the health, heat, and stress in the buffalo 102 based on images acquired during daytime or during times with adequate lighting along with the sensor information, the animal information, the time stamp information, the ambient information, or combinations thereof. It may be noted that during the daytime and/or during times with adequate lighting, the optical images 106 may be primarily used to track activity in the buffaloes 102 on the farm. Additionally, the optical images 106 may be primarily used for segmentation of both the optical images 106 and IR images 110 to identify one or more anatomical regions of interest of the buffalo 102. Also, the IR images 110 may be used to obtain accurate skin temperature measurements of the buffalo 102.
[0029] Once the one or more optical images 106 and the one or more IR images 110 corresponding to the buffalo 102 are respectively acquired via the optical camera 104 and the IR camera 108, these images 106, 110 may be communicated to an exemplary health monitoring system 112. In one embodiment, the images 106, 110 may be wirelessly communicated to the health monitoring system 112. In other embodiments, the images 106, 110 may be transmitted via other means to the health monitoring system 112. Additionally, the sensor information, animal information, ambient information, and time stamp information may also be communicated to the health monitoring system 112.
[0030] In a presently contemplated configuration, the health monitoring system 112 is depicted as including an acquisition subsystem 114 and a processing subsystem 116. The acquisition subsystem 114 is configured to receive the optical images 106 from the optical cameras 104 and the IR images from the IR cameras 108, where the optical images 106 and the IR images 110 correspond to a given buffalo 102. In addition, the acquisition subsystem 114 is also configured to receive the sensor information, animal information, ambient information, and/or time stamp information associated with the buffalo 102. However, in certain other embodiments, the acquisition subsystem 114 may obtain the optical images 106 and/or the IR images 110 and/or the other information from a storage such as the data repository 126, an optical data storage article such as a compact disc (CD), a digital versatile disc (DVD), a Blu-ray disc, and the like.
[0031] Further, the acquisition subsystem 114 is configured to communicate the optical images 106, the IR images 110, the sensor information, animal information, ambient information, and time stamp information corresponding to a given buffalo 102 to the processing subsystem 116. In addition, data corresponding to the optical images 106 and the IR images 110 as respectively measured by the optical cameras 104 and the IR cameras 108 is updated to a local storage such as the data repository 126. It may be noted that the data corresponding to the optical images 106 and the IR images 110 associated with each buffalo 102 may be stored in the data repository 126 with a time stamp. Furthermore, the sensor information, animal information, ambient information, and time stamp information corresponding to each buffalo 102 may also be stored in the data repository 126 along with the corresponding optical images 106 and/or IR images 110. Any changes in the data related to the buffalo 102 may be updated in the data repository 126.
[0032] In accordance with exemplary aspects of the present specification, the processing subsystem 116 is configured to use AI-based approaches to enhance low-light images/videos, detect different anatomical parts or regions of interest of the buffalo 102, recognize buffalo activity, and obtain accurate skin temperature measurements. The segmented anatomical regions of interest, the identified desired anatomical regions of interest, the detected buffalo activity, and the skin temperature measurements may generally be referred to as data associated with or deduced/derived from the optical images 106 and the IR images 110 of the buffalo 102. Subsequently, the processing subsystem 116 is configured to employ an AI-based data fusion approach to generate a more accurate prediction of health, heat, and stress in the buffaloes 102 based on the data associated with the optical images 106 and the IR images 110, sensor information, animal information, ambient information, and time stamp information.
[0033] In a presently contemplated configuration of FIG. 1, the processing subsystem 116 includes a monitoring platform 118 and one or more artificial intelligence (AI) models 120. The monitoring platform 118 is configured to use AI-based approaches in conjunction with the AI models 120 to process the data in the optical images 106 and the IR images 110 in combination with the sensor information, animal information, ambient information, and time stamp information to generate one or more outcomes that are representative of a more accurate prediction of health, heat, and stress in the buffaloes 102. It may be noted that the terms AI model and model may be used interchangeably.
[0034] In one example, each AI model 120 may include a neural network such as a CNN, a deep neural network, and the like. Further, in one embodiment, the model 120 may include a neural network that is trained to perform a specific AI-based task or function. In particular, the AI model 120 is configured to receive as input a set of parameters and provide as output a desired outcome. One example of a model 120 includes a low-light enhancement model that is configured to receive as input the optical images 106 and/or the IR images 110 and provide as output an AI-based enhancement of the lighting of the optical images 106 and/or the IR images 110. Another example of a model 120 includes a segmentation model that is configured to facilitate an AI-based identification of various anatomical regions of interest of the buffalo 102 in the optical images 106 and/or IR images 110. Yet another example of a model 120 includes an activity detection model that is configured to receive as input the optical images 106 and/or the IR images 110 and provide as output an AI-based tracking or identification of any activity in the anatomical regions of interest of the buffalo 102. One more example of a model 120 includes a data fusion model that is configured to provide an AI-based approach to fuse or integrate data from the optical images 106 and IR images 110, sensor information, animal information, ambient information, time stamp information, and the like to generate as output a desired clinical outcome that is representative of the health, heat, or stress condition of the buffalo 102. The generation of these models 120 will be described later.
[0035] The processing subsystem 116 may include one or more application-specific processors, digital signal processors, microcomputers, graphical processing units, microcontrollers, Application Specific Integrated Circuits (ASICs), Programmable Logic Arrays (PLAs), Field Programmable Gate Arrays (FPGAs), and/or any other suitable processing devices. In some embodiments, the processing subsystem 116 may alternatively be configured to retrieve the information related to the optical images 106 and the IR images 110 from the data repository 126. The data repository 126 may include a hard disk drive, a floppy disk drive, a read/write CD, a DVD, a Blu-ray disc, a flash drive, a solid-state storage device, a local database, and the like.
[0036] In addition, the examples, demonstrations, and/or process steps performed by certain components of the system 100 such as the processing subsystem 116 may be implemented by suitable code on a processor-based system, where the processor-based system may include a general-purpose computer or a special-purpose computer. Also, different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently.
[0037] As will be appreciated, some of the optical images 106 and/or the IR images 110 may be obtained in low light conditions. It is desirable to enhance the lighting in such low-light images. In some embodiments, the time stamp associated with the optical images 106 and/or the IR images 110 may be used to determine the time of day corresponding to the acquisition of these images 106, 110. Accordingly, the time stamp may be used to determine if the images 106, 110 are acquired during the daytime/adequate lighting conditions or nighttime/low-light or no light conditions. If it is determined that the images 106, 110 are acquired under low-light conditions, the monitoring platform 118 is configured to enhance the low-light optical images 106 and/or the IR images 110. In accordance with aspects of the present specification, the monitoring platform 118 is configured to process the optical images 106 and the IR images 110 using an AI-based technique to facilitate the low-light enhancement. In one example, the AI-based approach for low-light enhancement may entail use of a convolutional neural network (CNN) and the like. Further, an AI model such as a low-light enhancement model 120 may be used to provide an AI-based approach for low-light enhancement of the images 106, 110. The enhanced optical images 106 and the IR images 110 may be used to monitor the health, heat, and stress of the buffalo 102. However, in certain other embodiments, the optical cameras 104 and/or the IR cameras 108 may be configured to facilitate the low-light enhancement of the optical images 106 and IR images 110, respectively, prior to communicating the images 106, 110 to the acquisition subsystem 114.
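A minimal sketch of the time-stamp check and a brightness boost is shown below. The daytime hour boundaries are assumed values, and a simple gamma curve stands in for the CNN-based low-light enhancement model the specification contemplates.

```python
def is_low_light(hour_of_day: int, day_start: int = 6,
                 day_end: int = 18) -> bool:
    """Use the acquisition time stamp to flag nighttime/low-light images."""
    return not (day_start <= hour_of_day < day_end)

def gamma_enhance(pixels, gamma: float = 0.5):
    """Brighten grayscale pixel values (0-255); gamma < 1 lifts dark tones.
    A trained low-light enhancement model would replace this placeholder."""
    return [round(255 * (p / 255) ** gamma) for p in pixels]

frame = [10, 40, 90]  # a few dark pixel values from a nighttime frame
if is_low_light(hour_of_day=23):
    frame = gamma_enhance(frame)
```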
[0038] Subsequent to any low-light enhancement of the optical images 106 and/or IR images 110, the monitoring platform 118 is configured to monitor the health of the buffalo 102 via use of the optical images 106 in conjunction with the sensor information, animal information, ambient information, and/or time stamp information. More particularly, to monitor the health of the buffalo 102, the monitoring platform 118 may be configured to process the optical images 106 of the buffalo 102 to detect activity or events in the buffalo 102. Subsequently, the monitoring platform 118 may be configured to monitor the detected events to determine the health of the buffalo 102. Some examples associated with the health of the buffalo include stress, fertility status, lameness, diarrhea, parturition, acidosis, ketosis, mastitis, laminitis, and the like. Additionally, the monitoring platform 118 may also be configured to monitor the detected events to detect estrus in the buffalo 102. It may be noted that the terms “activity,” “event,” and “events” may be used interchangeably. As previously noted, some non-limiting examples of the events or activity of the buffalo 102 that are detected and/or monitored include rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like.
[0039] Further, as noted hereinabove, the monitoring platform 118 is configured to detect activity or events in the buffalo 102 using the optical images 106. To detect the events in the buffalo 102, it is desirable to monitor one or more anatomical regions in the buffalo 102. Accordingly, the monitoring platform 118 is configured to process the optical images 106 to identify one or more anatomical regions of interest in the buffalo 102. In particular, the monitoring platform 118 is configured to use an AI-based approach to identify or segment one or more anatomical regions of interest of the buffalo 102 in the optical images 106. The AI-based approach is configured to detect objects such as anatomical regions of interest in the optical images 106 in real-time. In one non-limiting example, a convolutional neural network (CNN) and the like may be employed to facilitate the detection or segmentation of the anatomical regions of interest in the optical images 106 in real-time. In one example, an AI model such as a segmentation model 120 may be used to provide an AI-based approach to facilitate the detection or segmentation of the anatomical regions of interest in the optical images 106 in real-time. Also, some non-limiting examples of the segmented anatomical regions of interest include the lips, mouth, nostrils, ears, tail, tongue, vulva, udders, eyes, and the like of the buffalo 102.
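As a crude stand-in for the segmentation step just described, the sketch below locates the bounding box of a bright (or, for IR images, warm) region in a small 2-D image. A trained CNN detector would replace this thresholding in practice; the function name and threshold are illustrative assumptions.

```python
def region_bbox(image, thresh):
    """Return the bounding box (x_min, y_min, x_max, y_max) of pixels above
    `thresh` in a 2-D image given as a list of rows, or None if no pixel
    exceeds the threshold."""
    coords = [(x, y) for y, row in enumerate(image)
              for x, v in enumerate(row) if v > thresh]
    if not coords:
        return None
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs), max(ys))

# A toy IR frame with one warm 2x2 region surrounded by cooler background.
ir_image = [
    [20, 20, 20, 20],
    [20, 90, 95, 20],
    [20, 88, 92, 20],
]
```

Cropping each image to such per-region boxes is what allows the later activity-detection and skin-temperature steps to operate on one anatomical region of interest at a time.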
[0040] Once the anatomical regions of interest in the optical images 106 are segmented/identified, the monitoring platform 118 is configured to monitor the segmented anatomical regions of interest in the optical images 106 to efficiently detect buffalo activity. In one embodiment, the monitoring platform 118 may be configured to use an AI-based approach to detect or recognize buffalo activity. In one example, the AI-based approach for detecting or recognizing buffalo activity may entail use of a convolutional neural network (CNN) and the like. In one example, an AI model such as an activity detection model 120 may be utilized to provide an AI-based approach to detect or recognize buffalo activity in real-time. Specifically, to detect or recognize buffalo activity via the activity detection model 120, the monitoring platform 118 may be configured to examine consecutive optical images 106 or video frames to detect activity in real-time corresponding to the segmented anatomical regions of interest. In one example, the monitoring platform 118 in conjunction with the activity detection model 120 may be configured to use anatomy meshing, anatomy tracking, motion detection, and inferencing to detect activity corresponding to the segmented anatomical regions of interest of the buffalo 102. As previously noted, some non-limiting examples of the buffalo activity include rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like. By way of example, the vulva may be monitored to detect any swelling or reddening, where the swelling of the vulva may be indicative of estrus. Similarly, the lips may be monitored to detect any curling, which may be indicative of estrus.
In another example, the segmented udders may be monitored to detect any swelling, which in turn may be indicative of inflammation or mastitis in the udders.
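The examination of consecutive frames just described may be sketched, again purely for illustration, with simple frame differencing inside a segmented region; the learned activity detection model 120 would replace this heuristic with anatomy meshing, tracking, and inferencing, and the motion threshold below is an assumed placeholder:

```python
import numpy as np

# Heuristic stand-in for the activity detection model (120): activity is
# flagged when the mean absolute pixel change between consecutive frames,
# inside the segmented region of interest, exceeds an assumed threshold.
def motion_score(prev_frame, curr_frame, roi):
    """Mean absolute pixel change inside the (y0, y1, x0, x1) region."""
    y0, y1, x0, x1 = roi
    a = prev_frame[y0:y1, x0:x1].astype(float)
    b = curr_frame[y0:y1, x0:x1].astype(float)
    return float(np.abs(b - a).mean())

def detect_activity(frames, roi, threshold=0.1):
    """Indices of frames whose change from the previous frame exceeds the threshold."""
    return [i for i in range(1, len(frames))
            if motion_score(frames[i - 1], frames[i], roi) > threshold]

frames = [np.zeros((4, 4)), np.zeros((4, 4)), np.ones((4, 4))]  # movement at frame 2
events = detect_activity(frames, (0, 4, 0, 4))
```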
[0041] As previously noted, surface or skin temperature in the buffalo 102 is a valuable indicator of a physiological status and/or disease in the buffalo 102. The skin temperature may be indicative of blood circulation, tissue metabolism, and the like. By way of example, an increase in the temperature of the udder of the buffalo 102 may be indicative of an inflammatory response due to mastitis in the udder. Accordingly, any abnormality in thermal patterns based on the skin temperature measurements may be indicative of inflammation or disease in the buffalo 102. Hence, it is essential to accurately measure the skin temperature.
[0042] Currently available techniques for measuring the temperature of the buffalo entail use of temperature sensors for measuring the skin temperature of buffaloes. However, these techniques fail to accurately measure the buffalo skin temperature due to the thickness of the buffalo skin. IR cameras have also been used to monitor a buffalo’s skin temperature to observe health and heat status. However, the thick skin of the buffalo 102 impedes accurate skin temperature measurement. Accordingly, any diagnoses based on skin temperature measurements obtained using the currently available techniques may be erroneous.
[0043] In accordance with the aspects of the present specification, to monitor the health, heat, and stress in the buffalo 102 based on the exemplary AI-based approach, the monitoring platform 118 is configured to use the IR images 110 for thermal imaging to facilitate accurate skin temperature measurement of the buffalo 102. More particularly, the monitoring platform 118 is configured to acquire skin temperature measurements from anatomical regions of the buffalo 102 that are more suitable for obtaining accurate skin temperature measurements. The more suitable or desired anatomical regions of interest are generally representative of anatomical regions in the buffalo 102 that have relatively thinner skin and hence facilitate more accurate measurement of the skin temperature of the buffalo 102. In one example, the desired anatomical regions of interest in the buffalo 102 may include the udders, the underside of the belly, the inner ear, and the like. These skin temperature measurements may in turn be used to assess the health of the buffalo 102. Some non-limiting examples of diagnoses of the buffalo 102 based on the skin temperature measurements may include estrus, lameness, mastitis, laminitis, diarrhea, stress, and the like.
[0044] To facilitate the accurate measurement of skin temperature of the buffalo 102, the monitoring platform 118 is configured to identify one or more desired anatomical regions of interest in the IR images 110 of the buffalo 102. In particular, the monitoring platform 118 is configured to identify the desired anatomical regions of interest in the IR images 110 using an AI-based approach and based on the segmented anatomical regions of interest in the optical images 106.
[0045] To that end, the monitoring platform 118 is configured to register the one or more IR images 110 and the one or more optical images 106. Subsequently, the monitoring platform 118 is configured to use an AI-based approach to identify corresponding anatomical regions of interest in the IR images 110 of the buffalo 102 based on the segmented anatomical regions of interest in the optical images 106 in real-time. In one example, an AI model such as a segmentation model 120 may be employed to facilitate this identification. Once the anatomical regions of interest are segmented in the IR images 110, the monitoring platform 118 may further employ the segmentation model 120, guided by the segmented anatomical regions of interest in the optical images 106, to identify one or more desired anatomical regions of interest in the IR images 110 in real-time for more accurate thermal imaging. Subsequently, accurate skin temperature measurements of the buffalo 102 may be obtained from the identified desired anatomical regions of interest in the IR images 110. Use of the segmented anatomical regions of interest in the optical images 106 facilitates enhanced identification of the corresponding anatomical regions of interest and the desired anatomical regions of interest in the IR images 110. Moreover, use of the desired anatomical regions of interest in the IR images 110 for thermal imaging enables the acquisition of more accurate skin temperature measurements in the buffalo 102.
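One hypothetical way to picture the registration step is as an affine mapping from optical-image coordinates to IR-image coordinates, under which a region segmented in the optical image can be transferred into the IR image. The matrix below is an assumed calibration between the two cameras, not a value from the specification; a real system would estimate this transform during registration:

```python
import numpy as np

# Hypothetical affine calibration mapping optical-image pixel coordinates to
# IR-image pixel coordinates; a real registration step would estimate this
# transform, e.g. from matched features or a calibration target.
AFFINE = np.array([[0.5, 0.0, 10.0],   # IR_x = 0.5 * opt_x + 10
                   [0.0, 0.5, 20.0]])  # IR_y = 0.5 * opt_y + 20

def optical_to_ir(points):
    """Map an (N, 2) list of optical (x, y) points into IR coordinates."""
    pts = np.hstack([np.asarray(points, dtype=float),
                     np.ones((len(points), 1))])
    return pts @ AFFINE.T

# Corners of a region segmented in the optical image, transferred to the IR image.
optical_roi = [(100, 200), (300, 200), (300, 400), (100, 400)]
ir_roi = optical_to_ir(optical_roi)
```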
[0046] It may be noted that while the preceding paragraphs describe the method for monitoring of health, heat, and stress in the buffalo 102 using optical images 106 and IR images 110 obtained during daytime or in adequate lighting conditions, the following paragraphs describe the method for monitoring of health, heat, and stress in the buffalo 102 using optical images 106 and IR images 110 obtained during nighttime or in poor light or no light conditions. As noted hereinabove, it may not be possible to obtain any relevant information from the optical images 106 acquired during poor light or no light conditions. In accordance with aspects of the present specification, during nighttime, poor light or no light conditions, the IR images 110 may be primarily used to aid in the segmentation of anatomical regions of interest in the optical images 106. Additionally, the IR images 110 may also be used to identify desired anatomical regions of interest in the IR images 110. Moreover, any buffalo activity may be tracked primarily using the IR images 110.
[0047] In the poor lighting or no light scenario, subsequent to any low-light enhancement of the optical images 106 and/or IR images 110, the monitoring platform 118 is configured to monitor the health of the buffalo 102 via use of the IR images 110. Accordingly, the monitoring platform 118 is configured to detect activity or events in the buffalo 102 using the IR images 110. In one embodiment, the monitoring platform 118 is configured to use an AI-based approach to identify or segment one or more anatomical regions of interest of the buffalo 102 in the IR images 110. The AI-based approach is configured to detect objects such as anatomical regions of interest in the IR images 110 in real-time. A CNN and the like may be employed to facilitate the detection or segmentation of the anatomical regions of interest in the IR images 110 in real-time. For example, an AI model such as a segmentation model 120 may be used to aid in the detection or segmentation of the anatomical regions of interest in the IR images 110 in real-time.
[0048] Subsequent to the segmentation of the IR images 110, the monitoring platform 118 is configured to monitor the segmented anatomical regions of interest in the IR images 110 for any activity or events such as rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like. In one example, an AI-based approach is utilized to detect or recognize buffalo activity in real-time. Accordingly, in one embodiment, the monitoring platform 118 in conjunction with an AI model such as an activity detection model 120 may be configured to examine consecutive IR images 110 or video frames to detect activity corresponding to the segmented anatomical regions of interest. In one embodiment, the monitoring platform 118 along with the activity detection model 120 may be configured to use anatomy meshing, anatomy tracking, motion detection, and inferencing to detect activity corresponding to the segmented anatomical regions of interest of the buffalo 102. For example, segmented udders may be monitored to detect any swelling, which in turn may be indicative of inflammation or mastitis in the udders. Also, the vulva may be monitored to detect any swelling or reddening, where the swelling of the vulva may be indicative of estrus. In another example, the lips may be monitored to detect any curling, which may be indicative of estrus.
[0049] Additionally, to facilitate more accurate skin temperature measurements of the buffalo 102 the monitoring platform 118 is also configured to use an AI-based approach to identify desired anatomical regions of interest having thinner skin from the IR images 110. In one example, an AI model 120 may be used to identify desired anatomical regions of interest having thinner skin from the IR images 110. Subsequently, the monitoring platform 118 is configured to obtain accurate skin temperature measurements of the buffalo 102 from the desired anatomical regions of interest in the IR images 110.
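The skin temperature measurement from a desired thin-skin region may be sketched as averaging radiometrically calibrated IR pixel values inside a region mask. The gain and offset constants below are assumed placeholders; actual values would come from the IR camera's radiometric calibration:

```python
import numpy as np

# GAIN and OFFSET are assumed radiometric calibration constants (degrees C
# per raw count, plus an offset); actual values would come from the IR
# camera's calibration, not from this sketch.
GAIN, OFFSET = 0.04, 10.0

def roi_temperature(ir_frame, mask):
    """Mean calibrated temperature over the masked desired region of interest."""
    raw = ir_frame[mask]
    return float((raw * GAIN + OFFSET).mean())

ir_frame = np.full((4, 4), 700.0)     # raw counts; 700 maps to 38.0 degrees C
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                 # the identified thin-skin region
temp_c = roi_temperature(ir_frame, mask)
```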
[0050] In further accordance with the exemplary AI-based approach for monitoring the health, heat, and stress in the buffalo 102, the monitoring platform 118 is configured to use the IR images 110 to segment the optical images 106. To that end, the monitoring platform 118 is configured to register the optical images 106 with the IR images 110. Further, the monitoring platform 118 is configured to use an AI-based approach via use of an AI model such as a segmentation model 120 to segment the optical images 106 to identify the anatomical regions of interest based on the segmented anatomical regions of interest in the IR images 110.
[0051] Consequent to the processing of the optical images 106 and the IR images 110 by the monitoring platform 118, the optical images 106 may be used to obtain data related to the buffalo activity, while the IR images 110 may be used to obtain data related to accurate skin temperature measurements of the buffalo 102 and data related to the buffalo activity. By way of example, the data related to the optical images 106 retrieved by the monitoring platform 118 may include the segmented anatomical regions of interest in the buffalo 102, data related to detected activity corresponding to the segmented anatomical regions of interest in the buffalo 102, and the like. Similarly, the data related to the IR images 110 retrieved by the monitoring platform 118 may include the segmented anatomical regions of interest and desired anatomical regions of interest in the buffalo 102, data related to the detected activity, skin temperature measurements obtained from the desired anatomical regions of interest, and the like.
[0052] In accordance with further aspects of the present specification, the processing subsystem 116 is configured to employ an exemplary AI-based data fusion approach to integrate or “fuse” the data related to the optical images 106, the data related to the IR images 110, the sensor information, the animal information, the ambient information, the time stamp information, or combinations thereof to deliver enhanced accuracy in the prediction of health, stress, and heat in the buffalo 102. In one example, the AI-based data fusion approach used by the monitoring platform 118 is configured to employ an AI model such as a data fusion model 120 to integrate the received data to generate one or more outcomes, where the one or more outcomes are representative of a health condition of the buffalo 102, a heat/estrus condition of the buffalo 102, a stress condition of the buffalo 102, or combinations thereof. In particular, the monitoring platform 118 is configured to provide the received data as input to the data fusion model 120 to cause the data fusion model 120 to generate as output the one or more outcomes. Accordingly, the data fusion model 120, when deployed, is configured to aid the monitoring platform 118 in fusing/integrating the data from the optical images 106 and the IR images 110 and the information related to the buffalo 102 to generate outputs in the form of clinical or health-related outcomes that are representative of enhanced estimation or prediction of the health, heat, and stress in the buffalo 102.
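As a minimal sketch of the data fusion step, heterogeneous features derived from the optical images, IR images, sensor information, and ambient information may be combined by a single logistic scoring unit. The feature names, weights, and bias below are illustrative assumptions; the data fusion model 120 described here would learn such a mapping from training data rather than use hand-set values:

```python
import math

# Illustrative feature weights; the data fusion model (120) would learn such
# weights rather than use hand-set values.
WEIGHTS = {
    "vulva_swelling": 2.0,   # from the optical/IR activity detection
    "restlessness":   1.0,   # from the activity detection
    "skin_temp_dev":  1.5,   # skin temperature deviation from baseline, in C
    "ambient_temp":  -0.02,  # ambient information from the farm
}
BIAS = -3.0

def estrus_score(features):
    """Fuse heterogeneous features into a 0..1 estrus score via a logistic unit."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

score = estrus_score({"vulva_swelling": 1.0, "restlessness": 1.0,
                      "skin_temp_dev": 0.8, "ambient_temp": 30.0})
```

A deployed fusion model would typically output several such scores at once, one per outcome (health condition, heat/estrus condition, stress condition).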
[0053] When the clinical outcome generated by the monitoring platform 118 includes a prediction or estimation of the health, heat, or stress corresponding to the buffalo 102, the monitoring platform 118 is also configured to process the predictions/estimations to facilitate further analysis, recommendations, and/or treatment planning. The predicted clinical outcomes and any other relevant information generated by the monitoring platform 118 may be communicated to farm hands and/or veterinarians. The predictions/estimations may be represented as indicators in the form of a graphic, a chart, or any other audio and/or visual representation. The outcomes and/or the indicators may be visualized on an interface unit such as a display.
[0054] In some embodiments, subsequent to the generation of the clinical outcomes, the monitoring platform 118 may also be configured to generate reports that are representative of the health, heat, and stress state of the buffalo 102. Additionally, reports related to any movement and activity of the buffalo 102 may also be generated. These reports may be customized based on the needs of the farm. By way of example, the health/heat/stress reports, movement reports, and any other customized reports may be generated on an hourly basis, daily basis, monthly basis, yearly basis, shift-wise basis, or over a custom date range.
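The report generation described above may be sketched as bucketing time-stamped detections by animal and by reporting period; the record layout and identifiers below are hypothetical examples, not fields defined by the specification:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical record layout: (animal_id, timestamp, detected_event).
def bucket_events(events, period="hour"):
    """Group time-stamped detections by animal and by hourly or daily bucket."""
    fmt = {"hour": "%Y-%m-%d %H:00", "day": "%Y-%m-%d"}[period]
    report = defaultdict(list)
    for animal_id, ts, event in events:
        report[(animal_id, ts.strftime(fmt))].append(event)
    return dict(report)

events = [
    ("B-102", datetime(2024, 5, 1, 9, 15), "restlessness"),
    ("B-102", datetime(2024, 5, 1, 9, 40), "lip curling"),
    ("B-102", datetime(2024, 5, 1, 11, 5), "bellowing"),
]
hourly = bucket_events(events)
```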
[0055] The system 100 that includes the monitoring platform 118 and the AI models 120 provides a robust framework that allows the system 100 to accurately monitor the health, heat, and stress in the buffalo 102. Specifically, the monitoring platform 118 works in conjunction with the models 120 to efficiently combine data from the optical images 106 and the IR images 110 and other information corresponding to the buffalo 102 via an exemplary AI-based data fusion approach to generate clinical outcomes that are representative of enhanced prediction of the health, heat, and stress in the buffalo 102.
[0056] It may be noted that although the embodiment depicted in FIG. 1 depicts the processing subsystem 116 as including the monitoring platform 118, in some embodiments, the monitoring platform 118 may be employed as a standalone unit that is physically separate from the processing subsystem 116 and/or the health monitoring system 112. In certain other embodiments, the monitoring platform 118 may be integrated into the optical cameras 104, the IR cameras 108, or both the optical cameras 104 and the IR cameras 108. Also, in certain embodiments, the models 120 may be located outside the processing subsystem 116 and the system 100.
[0057] With continuing reference to FIG. 1, in one embodiment, the monitoring platform 118 is configured to maintain a model, such as the AI models 120. As used herein, “maintain a model” may entail generating the model 120 and hosting the model 120. The model 120 may be hosted in a data repository 126, the cloud, and the like. Also, the model 120 may be hosted in a local repository, a remote repository, the cloud, and the like. In one example, the model 120 may include a neural network such as a CNN, a deep neural network, and the like.
[0058] Generating an AI model entails “training” a neural network to “learn” a desired task. Accordingly, the neural network is trained with appropriate inputs corresponding to the desired task. In certain embodiments, the monitoring platform 118 may include the neural network and may be configured to generate the AI models.
[0059] In the example of a data fusion model 120, the neural network is configured to receive as input a set of parameters or data deduced or derived from the optical images 106 and the IR images 110 and other buffalo information and provide as output a clinical outcome related to the health/heat/stress of the buffalo 102. In one example, the set of parameters corresponding to the buffalo 102 provided as input to the neural network may include data such as detected activity in the anatomical regions of interest (for example, rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like). The set of parameters may also include the skin temperature measurements. Additionally, the set of parameters may include sensor information, animal information, ambient information, and the like.
[0060] Moreover, one or more desired outputs are also provided to the neural network. The desired outputs may include a plurality of sets of clinical outcomes corresponding to the plurality of animals. Some non-limiting examples of the clinical outcomes include a health condition of the animals, a heat condition of the animals, a stress condition of the animals, or combinations thereof. Also, some non-limiting examples of the health of the animals include stress, estrus, fertility status, lameness, diarrhea, parturition, acidosis, ketosis, mastitis, laminitis, or combinations thereof.
[0061] Once the inputs and the desired outputs are provided to the neural network, the neural network may be trained to generate an AI model that is configured to perform a desired task. In one example, the neural network may be trained to generate an AI model that is configured to provide a desired output. In the example of the data fusion model, in response to receipt of the set of parameters as input, the data fusion model 120 is configured to integrate or fuse the received input to generate an output in the form of a predicted clinical outcome that is representative of the health, heat, and stress corresponding to the buffalo 102.
[0062] With continuing reference to maintaining a model, in one example, a neural network such as a CNN or deep neural network (not shown) may be “trained” to generate the model 120 that is configured to facilitate the fusion of data from the optical images 106 and the IR images 110, sensor information, animal information, ambient information, and the like to deliver enhanced accuracy of prediction of health, stress, and heat estimation in the buffalo 102. In particular, the neural network is trained with appropriate inputs corresponding to the data fusion. In one non-limiting example, a plurality of sets of data related to optical images and IR images, sensor information, animal information, and ambient information corresponding to a plurality of buffaloes may be provided to the deep neural network as input. Some examples of the data provided as input to the deep neural network include skin temperature measurements, detected activity corresponding to the anatomical regions of interest in the buffalo 102, segmented anatomical regions of interest in the buffalo 102, and the like. Additionally, the input data provided to the deep neural network may also include sensor information, animal information, and ambient information. Further, one or more desired outputs may also be provided to the deep neural network. The desired outputs may include a plurality of sets of predictions related to health, heat, and stress corresponding to the plurality of buffaloes, which may be acquired and provided to the deep neural network. Other indicators of a disease state or illness, such as inflammation in the buffaloes, may also be provided to the deep neural network.
[0063] Subsequently, the deep neural network may be trained to provide an output in the form of a predicted clinical outcome that is indicative of health, heat, or stress in a buffalo. As will be appreciated, during the training or learning phase of the deep neural network, one or more model parameters in the form of weights of the deep neural network for predicting desired outcomes may be optimized. In particular, the model parameters may be optimized such that the loss between the predicted outcomes and the desired outputs is minimized to ensure that the predicted outcomes closely match the values of the desired outputs. Consequent to the training phase of the deep neural network, a model 120, such as a data fusion model, is generated that is configured to facilitate an AI-based “fusion” of the data related to the optical images 106 and the IR images 110, the sensor information, the animal information, and the ambient information to provide an output in the form of a predicted clinical outcome that is indicative of health, heat, or stress in a buffalo 102.
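The training phase may be illustrated with a toy one-neuron model whose weights are adjusted by gradient descent so that the loss between predicted and desired outcomes shrinks; the training points, feature choice, and learning rate are invented for illustration, and the deep neural network described above would have many more parameters:

```python
import math

# Toy training set: (skin-temperature deviation, activity level) -> desired
# outcome (1.0 = estrus). Values and learning rate are invented for illustration.
data = [((0.9, 0.8), 1.0), ((0.1, 0.2), 0.0),
        ((0.8, 0.9), 1.0), ((0.2, 0.1), 0.0)]
w, b, lr = [0.0, 0.0], 0.0, 1.0

for _ in range(200):                          # training iterations
    for (x1, x2), y in data:
        pred = 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = pred - y                        # gradient of the log-loss w.r.t. z
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

# After training, the loss-minimizing weights separate the two outcomes.
pred_high = 1.0 / (1.0 + math.exp(-(w[0] * 0.85 + w[1] * 0.85 + b)))
pred_low = 1.0 / (1.0 + math.exp(-(w[0] * 0.15 + w[1] * 0.15 + b)))
```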
[0064] Similarly, in another example, a model 120 such as a low-light enhancement model configured to facilitate an AI-based low-light enhancement of the optical images 106 and the IR images 110 may be generated. The low-light enhancement model 120 may include a CNN and the like. In yet another example, a model 120 such as a segmentation model configured to facilitate an AI-based segmentation of the optical images 106 and the IR images 110 may be generated. In this example, the AI-based approach provided by the segmentation model 120 is configured to detect objects such as anatomical regions of interest in the optical images 106 and the IR images 110 in real-time. The segmentation model 120 may include a CNN and the like that is trained and employed to facilitate the detection or segmentation of the anatomical regions of interest in the optical images 106 and the IR images 110 in real-time.
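A low-light enhancement step may be sketched, under the simplifying assumption of plain gamma correction on normalized intensities, as a stand-in for the learned low-light enhancement model 120 mentioned above:

```python
import numpy as np

# Gamma correction as an assumed stand-in for the learned low-light
# enhancement model (120): intensities are normalized to [0, 1], and
# gamma < 1 brightens dark pixels more than bright ones.
def enhance_low_light(frame, gamma=0.5):
    """Brighten a normalized image while leaving saturated pixels unchanged."""
    return np.clip(frame, 0.0, 1.0) ** gamma

dark = np.array([[0.04, 0.25],
                 [0.49, 1.00]])
bright = enhance_low_light(dark)
```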
[0065] In a similar fashion, a model 120 such as an activity detection model configured to facilitate an AI-based monitoring of the segmented regions of interest to detect buffalo activity may be generated. In this example, the activity detection model 120 is trained to provide an AI-based approach that is configured to examine consecutive optical and/or IR images or video frames and, via use of anatomy meshing, anatomy tracking, motion detection, and inferencing, detect activity corresponding to the segmented anatomical regions of interest of the buffalo 102.
[0066] The models 120 so generated may be stored in the data repository 126. In other embodiments, the models 120 may be transmitted for storage in a remote facility. It may be noted that in some embodiments, the models 120 may be generated offline and stored in the data repository 126, for example.
[0067] With continuing reference to FIG. 1, the health monitoring system 112 may include a display 122 and a user interface 124. In some embodiments, such as in a touch screen, the display 122 and the user interface 124 may overlap. Further, in some embodiments, the display 122 and the user interface 124 may include a common area. The display 122 may be configured to visualize or present the predicted clinical outcomes, such as information and/or indicators related to the health, heat, and stress in the buffalo 102. In addition, the optical images 106, the IR images 110, and any other relevant data may also be displayed on the display 122. Moreover, any reports generated by the monitoring platform 118 may be visualized on the display 122. Further, a live status of each buffalo 102 on a farm or multiple farms may be visualized on the display 122.
[0068] The user interface 124 of the health monitoring system 112 may include a human interface device (not shown) that is configured to aid a user such as a veterinarian or a farm hand in providing inputs or manipulating the outcomes and/or indicators visualized on the display 122. The user interface 124 may be used to provide access to authorized personnel to the system 100 and the relevant data. Also, in some embodiments, the user interface may be used to onboard buffalo stock by entering animal information corresponding to each buffalo 102 on the farm. In certain embodiments, the human interface device may include a trackball, a joystick, a stylus, a mouse, or a touch screen. It may be noted that the user interface 124 may be configured to aid the user in navigating through the inputs and/or outcomes/indicators generated by the health monitoring system 112.
[0069] Implementing the health monitoring system 112 that includes the monitoring platform 118 and models 120 as described hereinabove aids in enhancing the performance of the system 100 to accurately monitor and predict the health, heat, and stress states in buffaloes on a farm via use of AI-based approaches, thereby circumventing the shortcomings of traditional monitoring methods. Additionally, the system 100 provides a robust platform in the form of a dashboard that allows secure access only to authorized personnel using identity management and secure sign on. Also, the dashboard is configured to allow the farm hands or authorized personnel to onboard the buffalo stock on the farm as needed. Moreover, the dashboard provides access to the farm hands to view the live status of each buffalo on the farm. In addition, the various customized reports may be viewed via the dashboard. Further, in case of any deviations, the system 100 allows customer-defined actions to be performed, such as activating a hooter, pushing notifications to a cell phone, and displaying infographics on respective dashboards. Moreover, the system 100 provides an “anywhere-anytime” dashboard that is accessible on a mobile phone, a tablet, a laptop, and the like.
[0070] The working of the system 100 may be better understood with reference to FIGS. 2, 3, and 4(a)-4(c).
[0071] Embodiments of the exemplary method of FIG. 2 may be described in a general context of computer executable instructions on computing systems or a processor. Generally, computer executable instructions may include routines, programs, objects, components, data structures, procedures, modules, functions, and the like that perform particular functions or implement particular abstract data types.
[0072] Moreover, the embodiments of the exemplary method may be practiced in a distributed computing environment where optimization functions are performed by remote processing devices that are linked through a wired and/or wireless communication network. In the distributed computing environment, the computer executable instructions may be located in both local and remote computer storage media, including memory storage devices.
[0073] In addition, in FIG. 2, the exemplary method is illustrated as a collection of blocks in a logical flow chart, which represents operations that may be implemented in hardware, software, firmware, or combinations thereof. It may be noted that the various operations are depicted in the blocks to illustrate the functions that are performed. In the context of software, the blocks represent computer instructions that, when executed by one or more processing subsystems, perform the recited operations.
[0074] Further, the order in which the exemplary methods are described is not intended to be construed as a limitation, and any number of the described blocks may be combined in any order to implement the exemplary methods disclosed herein, or equivalent alternative methods. Further, certain blocks may be deleted from the exemplary methods or augmented by additional blocks with added functionality without departing from the spirit and scope of the subject matter described herein.
[0075] Conventional methods for monitoring the health of livestock such as buffaloes include traditional observation by farm hands to ensure the wellbeing of the animals, as well as the use of tags, sensors, and transponder units for collecting data from the animals. However, these approaches fail to facilitate efficient monitoring of the buffaloes.
[0076] In accordance with aspects of the present specification, the shortcomings of the presently available techniques are circumvented via use of an exemplary AI-based data fusion approach to combine data related to optical images and IR images of a buffalo and other related information to obtain more accurate prediction of health, heat, and stress in the buffaloes.
[0077] Referring to FIG. 2, a flow chart 200 of an exemplary method for monitoring the health of a buffalo, in accordance with aspects of the present specification, is presented. The method 200 of FIG. 2 is described with reference to the components of FIG. 1. Moreover, in certain embodiments, the method 200 may be performed by the monitoring platform 118.
[0078] Turning now to FIG. 2, a flow chart 200 of an exemplary method for monitoring the health of one or more buffaloes 102, in accordance with aspects of the present specification, is presented. The method 200 is configured to accurately predict one or more health-related/clinical outcomes of the buffalo 102 based on one or more optical images 106 corresponding to the buffalo 102, one or more infrared (IR) images 110 corresponding to the buffalo 102, sensor information related to the buffalo 102, animal information related to the buffalo 102, and ambient information related to a farm, or combinations thereof. The method 200 entails use of AI-based approaches to enhance low-light images/videos, detect different anatomical parts of the buffalo 102, recognize buffalo activity, and obtain accurate skin temperature measurements. Subsequently, the method 200 entails use of an exemplary AI-based data fusion approach to combine data derived/deduced from the one or more optical images 106 and/or the one or more IR images 110, the sensor information, the animal information, and the ambient information to obtain more accurate prediction of outcomes that are indicative of the health, stress, and heat of the buffalo 102, thereby circumventing shortcomings of the currently available techniques for monitoring the health, heat, and stress in the buffalo 102. The health-related outcomes may correspond to a health condition of the buffalo, a heat/estrus condition of the buffalo, a stress condition of the buffalo, and the like. In some embodiments, these predicted outcomes may be processed and used for recommendations for further analysis, follow-up, and/or treatment planning.
[0079] The method starts at step 202, where information regarding each buffalo 102 on the farm may be obtained and communicated to the system 100. Some examples of the buffalo information include sensor information and animal information. Additionally, the information may also include ambient information and time stamp information. As previously noted, the sensor information may be acquired via one or more sensors 105 disposed on or about the buffalo 102. In one example, the sensor information related to the buffalo 102 acquired from one or more sensors 105 may include pulse rate, heart rate variability, menstrual cycle, current location, and the like. A virtual fence may also be used to ascertain a current location of the buffalo 102, in some embodiments. Also, the animal information may include an age, sex, identification number, height, weight, and other such parameters corresponding to each buffalo 102. Additionally, the ambient information related to the ambient temperature and humidity on the farm may also be collected and communicated to the system 100.
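By way of a non-limiting illustration, the per-buffalo information gathered at step 202 may be pictured as a simple record; the field names and units below are assumptions made for illustration only and are not prescribed by the present specification.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class BuffaloRecord:
    # Animal information (illustrative field names and units)
    animal_id: str
    age_months: int
    sex: str
    height_cm: float
    weight_kg: float
    # Sensor information from the sensors 105 (optional until measured)
    pulse_rate_bpm: Optional[float] = None
    heart_rate_variability_ms: Optional[float] = None
    location: Optional[Tuple[float, float]] = None  # e.g., from a virtual fence
    # Ambient information for the farm
    ambient_temp_c: Optional[float] = None
    ambient_humidity_pct: Optional[float] = None
    timestamp: Optional[str] = None

record = BuffaloRecord(animal_id="BUF-042", age_months=54, sex="F",
                       height_cm=135.0, weight_kg=520.0,
                       pulse_rate_bpm=58.0, ambient_temp_c=31.5)
```

In a deployed system such records would be persisted in the data repository 126 alongside the corresponding images and time stamps.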
[0080] Subsequently, at step 204, one or more optical images 106 corresponding to the buffalo 102 may be obtained via use of one or more optical cameras 104 that have been strategically positioned on the farm. In a similar fashion, at step 206, one or more IR images 110 corresponding to the buffalo 102 may be obtained via use of one or more IR cameras 108 that have been strategically positioned on the farm. The optical images 106 and the IR images 110 may be communicated to the processing subsystem 116 via the acquisition subsystem 114 in the health monitoring system 112. In certain embodiments, the optical images 106 and the IR images 110 along with corresponding time stamps may be stored in the data repository 126. The sensor information, animal information, and ambient information may also be stored in the data repository 126 along with the corresponding optical images 106 and IR images 110.
[0081] According to aspects of the present specification, AI-based approaches may be used to enhance low-light images/videos, detect different anatomical parts of the buffalo 102, recognize buffalo activity, and obtain accurate skin temperature measurements. The segmented anatomical regions of interest, the identified desired anatomical regions of interest, the detected buffalo activity, and the skin temperature measurements may generally be referred to as data associated with or deduced/derived from the optical images 106 and the IR images 110 of the buffalo 102. In addition, an exemplary AI-based data fusion approach may be employed to generate a more accurate prediction of health, heat, and stress in the buffaloes 102. Accordingly, as indicated by step 207, one or more AI models 120 may be retrieved.
[0082] In some examples, as indicated by step 208, AI-based approaches may be used to enhance low-light images/videos such as the optical images 106 and/or the IR images 110 that are obtained in low-light or poor-light conditions. The low-light enhancement may be performed by the monitoring platform 118. In one embodiment, the AI-based approach for low-light enhancement may entail use of a convolutional neural network (CNN) and the like. For example, an AI model such as a low-light enhancement model 120 may be used to facilitate the AI-based low-light enhancement of the optical images 106 and/or the IR images 110. However, in other embodiments, the optical cameras 104 and the IR cameras 108 may be configured to perform the low-light enhancement of the optical images 106 and the IR images 110, respectively, prior to communicating the images 106, 110 to the acquisition subsystem 114.
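The low-light enhancement of step 208 is performed by a trained CNN in the specification; the classical gamma-correction sketch below is only a stand-in that shows the intended effect, namely mapping dark pixel values toward brighter ones.

```python
def enhance_low_light(image, gamma=0.5):
    """Brighten a grayscale image (pixel values 0-255) with gamma correction.

    The specification uses a trained low-light enhancement model 120; this
    simple gamma curve (gamma < 1 brightens) merely illustrates the idea.
    """
    return [[round(255 * (p / 255) ** gamma) for p in row] for row in image]

dark = [[10, 40], [90, 160]]
bright = enhance_low_light(dark, gamma=0.5)  # every pixel brightened
```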
[0083] As will be appreciated, the optical images 106 acquired during daytime or in conditions with adequate lighting may be better suited for gathering data related to the buffaloes 102. Also, during very poor lighting conditions or during nighttime, the IR images 110 are more suitable for gathering buffalo-related data. Accordingly, at step 210, a check may be carried out to verify whether the optical images 106 and the IR images 110 were obtained during the daytime/good lighting conditions or during the nighttime/poor lighting conditions. In one example, the time stamp associated with the acquisition of the optical images 106 and IR images 110 may be used to determine whether the images 106, 110 were obtained during the daytime or in adequate lighting conditions, or during the nighttime or in poor or no lighting conditions.
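The check at step 210 may be sketched as a simple rule on the acquisition time stamp; the fixed sunrise and sunset hours below are illustrative assumptions, and a deployed system might instead consult a measured ambient light level.

```python
from datetime import datetime

def is_daytime(timestamp, sunrise_hour=6, sunset_hour=18):
    """Decide which imaging branch to take at step 210.

    Hypothetical rule: an ISO-format acquisition time stamp between the
    assumed sunrise and sunset hours selects the optical-image pipeline;
    otherwise the IR pipeline is used.
    """
    hour = datetime.fromisoformat(timestamp).hour
    return sunrise_hour <= hour < sunset_hour
```

For example, an image stamped 14:30 would be routed to the daytime (optical) branch, while one stamped 02:10 would be routed to the nighttime (IR) branch.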
[0084] At step 210, if it is verified that the images 106, 110 were obtained during the daytime or in adequate lighting conditions, primarily the optical images 106 may be used to detect or track activity or events in the buffalo 102, which in turn may be used to determine the health, heat, and stress in the buffalo 102. Also, these optical images 106 may be primarily used to aid in the segmentation of the IR images 110 to obtain more accurate skin temperature measurements.
[0085] In accordance with aspects of the present specification, to monitor the health of the buffalo 102, the monitoring platform 118 may be configured to process the optical images 106 of the buffalo 102 to detect activity or events in the buffalo 102. Accordingly, as indicated by step 212, to facilitate the detection of buffalo activity, the optical images 106 may be segmented to identify one or more anatomical regions of interest. In some embodiments, the monitoring platform 118 is configured to use an AI-based approach to identify or segment one or more anatomical regions of interest of the buffalo 102 in the optical images 106. As previously noted, the AI-based approach is configured to detect objects such as anatomical regions of interest in the optical images 106 in real-time. In one non-limiting example, a convolutional neural network (CNN) and the like may be employed to facilitate the detection or segmentation of the anatomical regions of interest in the optical images 106 in real-time. For instance, an AI model such as a segmentation model 120 may be used to provide the AI-based approach. Also, some non-limiting examples of the segmented anatomical regions of interest include the lips, mouth, nostrils, ears, tail, tongue, vulva, udders, eyes, and the like of the buffalo 102.
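The segmentation model 120 of step 212 is a trained CNN; the threshold rule below is merely a placeholder that shows a binary region-of-interest mask being produced in the same shape as the input frame.

```python
def segment_roi(image, threshold=128):
    """Binary mask separating bright foreground pixels from background.

    The specification's segmentation model 120 delineates anatomical
    regions (lips, ears, vulva, udders, and so on); this intensity
    threshold is only an illustrative stand-in, not the trained model.
    """
    return [[1 if p >= threshold else 0 for p in row] for row in image]

mask = segment_roi([[10, 200], [150, 90]])  # -> [[0, 1], [1, 0]]
```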
[0086] Subsequently, as indicated by step 214, these segmented anatomical regions of interest in the optical images 106 may be monitored to detect or track any buffalo activity or events, where the detected activity/events may be used to determine the health of the buffalo 102. In one embodiment, the monitoring platform 118 may be configured to use an AI-based approach to detect or recognize buffalo activity. Also, the AI-based approach for detecting or recognizing buffalo activity may entail use of a convolutional neural network (CNN) and the like. In one example, an AI model such as an activity detection model 120 may be utilized to provide an AI-based approach to detect or recognize buffalo activity. Further, to detect or recognize buffalo activity via the AI-based approach, the monitoring platform 118 in conjunction with the activity detection model 120 may be configured to examine consecutive optical images or video frames to detect activity corresponding to the segmented anatomical regions of interest. In one example, the monitoring platform 118 in conjunction with the activity detection model 120 may be configured to use anatomy meshing, anatomy tracking, motion detection, and inferencing to detect activity corresponding to the segmented anatomical regions of interest of the buffalo 102. The detected events/activities may include any activity such as rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like. For example, the lips may be monitored to detect any curling, which may be indicative of estrus. Also, the vulva may be monitored to detect any swelling or reddening, where the swelling of the vulva may be indicative of estrus.
Similarly, the segmented udders may be monitored to detect any swelling, which in turn may be indicative of inflammation or mastitis in the udders.
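The examination of consecutive frames at step 214 may be pictured with a minimal frame-differencing sketch over a segmented region of interest; the bounding-box ROI format and the interpretation of the score are assumptions for illustration, not the trained activity detection model 120 itself.

```python
def roi_motion_score(prev_frame, curr_frame, roi):
    """Mean absolute pixel change inside a region of interest.

    Crude stand-in for the activity detection model 120: consecutive
    grayscale frames are compared within an ROI bounding box
    (x0, y0, x1, y1); a large score suggests rapid movement such as
    restlessness or tail flicking.
    """
    x0, y0, x1, y1 = roi
    diffs = [abs(curr_frame[y][x] - prev_frame[y][x])
             for y in range(y0, y1) for x in range(x0, x1)]
    return sum(diffs) / len(diffs)

prev = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 40, 0], [0, 0, 0]]
score = roi_motion_score(prev, curr, (0, 0, 3, 3))  # whole frame as ROI
```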
[0087] Moreover, in accordance with the aspects of the present specification, the IR images 110 are used for thermal imaging to facilitate acquisition of skin temperature measurements in the buffalo 102. In particular, skin temperature measurements are acquired from anatomical regions of the buffalo 102 that are more suitable for obtaining accurate skin temperature measurements, such as anatomical regions that have relatively thinner skin, for example, the udders, the underside of the belly, the inner ear, and the like. To that end, at step 216, the one or more IR images 110 are registered with the one or more optical images 106. Subsequently, at step 218, the one or more IR images 110 are segmented to identify one or more anatomical regions of interest based on the segmented optical images 106. In one embodiment, anatomical regions of interest in the IR images 110 are identified in real-time using an AI-based approach and based on the segmented anatomical regions of interest in the optical images 106. A convolutional neural network (CNN) and the like may be employed to facilitate the detection or segmentation of the anatomical regions of interest in the IR images 110 in real-time. In one example, an AI model such as a segmentation model 120 may be employed to aid in the segmentation of the anatomical regions of interest in the IR images 110 in real-time.
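The registration of step 216 may be illustrated with a toy translation-only search; real optical/IR registration would additionally handle differing resolutions, fields of view, and modalities, so the sum-of-absolute-differences scan below is a sketch under simplified assumptions.

```python
def register_translation(optical, ir, max_shift=2):
    """Estimate the (dy, dx) shift that best aligns an IR image to an
    optical image by exhaustive search over small integer translations.

    For each candidate shift, the mean absolute difference over the
    overlapping pixels is computed; the lowest-cost shift wins.
    """
    h, w = len(optical), len(optical[0])
    best = None
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            cost = n = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        cost += abs(optical[y][x] - ir[sy][sx])
                        n += 1
            if n and (best is None or cost / n < best[0]):
                best = (cost / n, (dy, dx))
    return best[1]

base = [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
# Same scene shifted up and left by one pixel
shifted = [[9, 9, 0, 0], [9, 9, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
offset = register_translation(base, shifted)  # -> (-1, -1)
```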
[0088] Subsequently, in accordance with aspects of the present specification, one or more desired anatomical regions of interest are identified from the segmented anatomical regions of interest in the IR images 110, as depicted by step 220. It may be noted that anatomical regions of interest in the buffalo 102 that are more suitable for obtaining accurate skin temperature measurements, such as anatomical regions that have relatively thinner skin, for example, the udders, the underside of the belly, the inner ear, and the like, may generally be referred to as desired anatomical regions of interest. More particularly, an AI-based approach may be used in conjunction with the segmented anatomical regions of interest in the optical images 106 to identify one or more desired anatomical regions of interest in the segmented IR images 110 for more accurate thermal imaging. By way of example, the monitoring platform 118 is configured to employ an AI model 120 to identify one or more desired anatomical regions of interest in the IR images 110 based on the segmented anatomical regions of interest in the optical images 106 for more accurate thermal imaging.
[0089] Further, at step 222, accurate skin temperature measurements corresponding to the buffalo 102 may be obtained from the one or more desired anatomical regions of interest. The skin temperature measurements may in turn be used to assess the health of the buffalo 102.
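Step 222 may be sketched as averaging radiometric pixel temperatures over a desired anatomical region of interest; the per-pixel temperature frame (in °C) and the binary mask format are illustrative assumptions.

```python
def roi_skin_temperature(thermal_frame, roi_mask):
    """Average temperature over a desired anatomical region of interest.

    thermal_frame holds per-pixel temperatures in degrees Celsius
    (illustrative); the mask marks pixels of a thin-skinned region such
    as the inner ear, where readings are assumed to be more accurate.
    """
    temps = [t for row_t, row_m in zip(thermal_frame, roi_mask)
             for t, m in zip(row_t, row_m) if m]
    return sum(temps) / len(temps)

frame = [[30.0, 38.6], [38.2, 29.5]]  # background vs. inner-ear pixels
roi = [[0, 1], [1, 0]]
mean_temp = roi_skin_temperature(frame, roi)  # averages 38.6 and 38.2
```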
[0090] Returning to step 210, if it is determined that the optical and IR images 106, 110 were obtained during the nighttime or in poor lighting conditions, the optical images 106 may not be suitable for tracking any buffalo activity or for segmenting anatomical regions of interest in the IR images 110. Accordingly, in these poor or no light conditions, primarily the IR images 110 are used to monitor buffalo activity, for segmentation, and for skin temperature measurements.
[0091] Accordingly, at step 224, the one or more IR images 110 are segmented to identify one or more anatomical regions of interest. An AI-based approach is employed to identify or segment one or more anatomical regions of interest of the buffalo 102 in the IR images 110 in real-time. A CNN and the like may be employed to facilitate the detection or segmentation of the anatomical regions of interest in the IR images 110 in real-time. By way of example, an AI model such as a segmentation model 120 may be used to aid in the detection or segmentation of the anatomical regions of interest in the IR images 110 in real-time.
[0092] Furthermore, as depicted by step 226, the identified one or more anatomical regions of interest in the IR images 110 are monitored to detect or track buffalo activity. In one embodiment, the monitoring platform 118 may be configured to use an AI-based approach to detect or recognize buffalo activity in real-time. The AI-based approach for detecting or recognizing buffalo activity may entail use of a convolutional neural network (CNN) and the like. Also, to detect or recognize buffalo activity via the AI-based approach, the monitoring platform 118 may be configured to employ an AI model such as an activity detection model 120 to examine consecutive IR images or video frames to detect activity corresponding to the segmented anatomical regions of interest. In one example, the monitoring platform 118 in conjunction with the activity detection model 120 may be configured to use anatomy meshing, anatomy tracking, motion detection, and inferencing to detect activity corresponding to the segmented anatomical regions of interest of the buffalo 102.
[0093] Furthermore, as depicted by step 228, one or more desired anatomical regions of interest may be identified from the segmented anatomical regions of interest in the IR images 110. In one example, an AI-based approach may be used to identify desired anatomical regions of interest having thinner skin from the IR images 110 to facilitate more accurate skin temperature measurements of the buffalo. In one example, a model 120 may be used to identify desired anatomical regions of interest having thinner skin from the IR images 110. Skin temperature measurements may be acquired from these desired anatomical regions of interest, as noted by step 230.
[0094] Moreover, in further accordance with the exemplary AI-based approach for monitoring the health, heat, and stress in the buffalo 102, the monitoring platform 118 is configured to use the IR images 110 to segment the optical images 106. To that end, the optical images 106 are registered with the IR images 110, as indicated by step 232. Subsequently, at step 234, an AI-based approach may be used to segment the optical images 106 to identify the anatomical regions of interest based on the segmented anatomical regions of interest in the IR images 110. The AI-based approach may entail use of a CNN and the like. For example, an AI model such as a segmentation model 120 may be used to aid in the detection or segmentation of the anatomical regions of interest in the optical images 106 in real-time. Also, in certain embodiments, the monitoring platform 118 is configured to monitor the segmented anatomical regions of interest in the optical images 106 to detect activity in the various regions of the buffalo 102 via use of an activity detection model 120.
[0095] Consequent to the processing of the optical images 106 and the IR images 110 by the monitoring platform 118 as indicated by steps 202-234, the optical images 106 may be used to obtain data related to detected buffalo activity, while the IR images 110 may be used to obtain data related to accurate skin temperature measurements of the buffalo 102 and data related to detected buffalo activity. Moreover, at step 236, an exemplary AI-based data fusion approach is used to obtain a more accurate prediction of health, stress, and heat of the buffalo 102. In particular, the data related to the optical images 106 and the IR images 110 such as, but not limited to, the detected buffalo activity, the skin temperature measurements, the segmented anatomical regions of interest, and other relevant information such as, but not limited to, the sensor information, animal information, and ambient information may be provided as input to an AI model such as a data fusion model. As previously noted, the data fusion model 120 may include a deep neural network (not shown) that is trained to receive an input, where the received input causes the data fusion model 120 to generate specific desired outputs. In one example, the data fusion model 120, when deployed, is configured to “fuse” or integrate the data provided as input to the data fusion model 120 to generate an output in the form of one or more health-related/clinical outcomes 238 that are representative of enhanced estimation or prediction of the health, heat, and stress in the buffalo 102. These clinical outcomes 238 may be further processed to provide a diagnosis of the buffalo 102, as indicated by step 240. Some non-limiting examples of diagnoses of the buffalo 102 may include estrus, lameness, mastitis, laminitis, diarrhea, stress, parturition, acidosis, ketosis, and the like.
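The data fusion model 120 of step 236 is described as a trained deep neural network; the single logistic unit below, with made-up (untrained) weights and a hand-picked feature vector, only illustrates how heterogeneous inputs can be combined into one bounded score.

```python
import math

def fuse_features(activity_score, skin_temp_c, pulse_rate_bpm,
                  ambient_temp_c, weights, bias):
    """Toy stand-in for the data fusion model 120.

    A single logistic unit maps image-derived activity, IR-derived skin
    temperature, sensor data, and ambient data to a score in (0, 1).
    The weights and bias are illustrative values, not trained ones.
    """
    features = [activity_score, skin_temp_c, pulse_rate_bpm, ambient_temp_c]
    z = sum(w * f for w, f in zip(weights, features)) + bias
    return 1.0 / (1.0 + math.exp(-z))

score = fuse_features(activity_score=4.4, skin_temp_c=39.1,
                      pulse_rate_bpm=72.0, ambient_temp_c=33.0,
                      weights=[0.3, 0.8, 0.05, 0.1], bias=-35.0)
```

In practice the deep network would learn such a mapping from labeled examples rather than relying on hand-set weights.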
[0096] In certain embodiments, subsequent to the generation of the clinical outcomes 238, customized reports that are representative of the health, heat, and stress of the buffalo 102, and movement of the buffalo 102 may be generated. These reports may be customized based on the needs of the farm. By way of example, the health/heat/stress reports, movement reports, and any other customized reports may be generated on an hourly basis, daily basis, monthly basis, yearly basis, shift wise basis, or over a custom date range. These reports may also be communicated to farm personnel to facilitate any planning or treatment options.
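The customized reporting described above may be sketched as grouping time-stamped clinical outcomes by reporting period; the (ISO timestamp, animal id, label) record format is an assumption for illustration.

```python
from collections import defaultdict

def daily_report(outcomes):
    """Group time-stamped clinical outcomes into per-day buckets.

    A minimal stand-in for the daily report; hourly, monthly, or
    shift-wise reports would simply group on a different key.
    """
    days = defaultdict(list)
    for ts, animal_id, label in outcomes:
        days[ts[:10]].append((animal_id, label))  # key on YYYY-MM-DD
    return dict(days)

report = daily_report([
    ("2024-05-01T06:10:00", "BUF-042", "estrus"),
    ("2024-05-01T18:40:00", "BUF-007", "normal"),
    ("2024-05-02T07:05:00", "BUF-042", "normal"),
])
```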
[0097] FIG. 3 is a schematic illustration of one embodiment of the method of FIG. 2 for monitoring the health of livestock on a farm, in accordance with aspects of the present specification. In particular, FIG. 3 is a schematic illustration 300 of one example of a method for generating a clinical outcome that is representative of a health, heat, or stress condition of the buffalo 102 of FIG. 2.
[0098] The method 300 of FIG. 3 is described with reference to the components of FIGs. 1 and 2. In one embodiment, the method 300 may be performed by the monitoring platform 118 in conjunction with the AI models 120. In the example of FIG. 3, the method 300 entails generating a clinical outcome that is representative of a health, heat, or stress of the buffalo 102 by the monitoring platform 118 based on an AI-based data fusion approach via use of an AI model such as a data fusion model 120. As previously noted, a neural network 302 is trained to generate a model such as the data fusion model 120. The model 302 is configured to accept as input 304 data corresponding to the optical images 106 and the IR images 110. As previously noted, some non-limiting examples of the data corresponding to the optical images 106 may include the segmented anatomical regions of interest in the buffalo 102, detected activity corresponding to the segmented anatomical regions of interest in the buffalo 102, and the like. Similarly, the data related to the IR images 110 may include the segmented anatomical regions of interest and desired anatomical regions of interest in the buffalo 102, the detected activity, skin temperature measurements obtained from the desired anatomical regions of interest, and the like. Further, some examples of the detected activity may include rapid movement, fluid discharges, changes in color features (for example, in the eyes), swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, and the like.
[0099] Additionally, the sensor information, the animal information, the ambient information, the time stamp information, and the like may also be provided as input to the model 302. As previously noted, the sensor information related to the buffalo 102 may include pulse rate, heart rate variability, menstrual cycle, current location, and the like, while the animal information related to the buffalo 102 may include age, sex, height, weight, and other such parameters of the buffalo 102. Also, the ambient information may include the ambient temperature and humidity on the farm. The time stamp information includes time stamps associated with the acquisition of the optical images 106 and infrared images 110, and the like. Subsequent to providing the input 304 to the model 302, the model 302, when deployed, is configured to provide as an output 306 an outcome in the form of an accurate estimation of the health, heat, and stress of the buffalo 102.
[0100] Referring now to FIGs. 4(a)-4(c), diagrammatical representations of some examples of performance of the system 100 of FIG. 1 for monitoring the health of a buffalo and the methods 200, 300 of FIGs. 2 and 3 for monitoring the health of a buffalo are presented. In particular, the examples of FIGs. 4(a)-4(c) are diagrammatical illustrations of providing the clinical outcomes to facilitate enhanced monitoring of the health, heat, and stress of the buffalo 102. Also, FIGs. 4(a)-4(c) are described with reference to the components of FIGs. 1, 2, and 3.
[0101] In the examples depicted in FIGs. 4(a)-4(c), schematic illustrations of a visualization of the predicted clinical outcomes 238, 306 are presented. As previously noted, the health monitoring system 112 and the monitoring platform 118 in conjunction with the models 120 in particular generate the predicted outcomes 238, 306 using an AI-based approach and communicate the predicted outcomes 238, 306 for visualization and/or storage to facilitate further analysis. It may be noted that in the example embodiments of FIGs. 4(a)-4(c), the predicted clinical outcomes 238, 306 are visualized on a handheld device such as a cell phone, a tablet, a laptop, and the like. However, other means of visualization are also anticipated. It may be noted that FIGs. 4(a)-4(c) present examples of the estrus condition of the buffalo 102.
[0102] Referring now to FIG. 4(a), a schematic representation 400 of one example of visualization of an output generated by the system 100 on a cell phone or handheld device 402 is depicted. In the example of FIG. 4(a), the clinical outcome 238 generated is indicative of estrus in the buffalo 102. In the present example, the “quality” or “state” of the estrus is indicated as “Buffalo in heat.” These outputs are represented in the form of a text box 404. Furthermore, one or more parameters such as health, stress, skin temperature measurements, and detected buffalo activity are presented as a graphical representation 406. Moreover, the “Buffalo in heat” state may also be represented in the form of an emoticon 408. In this example, the emoticon 408 may have a green color to indicate the “Buffalo in heat” state. Also, an audio indicator 410 of the “Buffalo in heat” state may be visualized.
[0103] FIG. 4(b) illustrates a schematic representation 420 of another example of visualization of an output generated by the system 100. In the example of FIG. 4(b), the clinical outcome 238 generated is indicative of estrus in the buffalo 102; however, the “quality” or “state” of the estrus is designated as “Buffalo will be in heat.” The generated clinical outcome is represented in the form of a text box 422. Additionally, the example of FIG. 4(b) also includes a graphical representation 424 representative of one or more parameters such as health, stress, skin temperature measurements, detected buffalo activity, and the like. Furthermore, the “Buffalo will be in heat” state may also be represented in the form of an emoticon 426. In this example, the emoticon 426 may have a yellow color to indicate the “Buffalo will be in heat” state. An audio indicator 428 of the “Buffalo will be in heat” state may be presented.
[0104] Similarly, FIG. 4(c) depicts a schematic representation 430 of yet another example of visualization of an output generated by the system 100. In the example of FIG. 4(c), the clinical outcome generated is representative of estrus in the buffalo 102. Moreover, the “quality” or “state” of the estrus is designated as “Buffalo not in heat.” The generated clinical outcome is represented in the form of a text box 432. Further, the example of FIG. 4(c) also includes a graphical representation 434 representative of one or more parameters such as health, stress, skin temperature measurements, detected buffalo activity, and the like. Also, the “Buffalo not in heat” state may also be represented in the form of an emoticon 436. In addition, the emoticon 436 may have a red color to indicate the “Buffalo not in heat” state. An audio indicator 438 of the “Buffalo not in heat” state may also be presented.
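The three example states of FIGs. 4(a)-4(c) suggest a straightforward mapping from the predicted estrus state to the displayed text-box message and emoticon color; the dictionary key names below are illustrative.

```python
def estrus_indicator(state):
    """Map a predicted estrus state to display elements.

    Mirrors the three example states of FIGs. 4(a)-4(c): the text-box
    message and emoticon color for each state; key names are assumptions.
    """
    mapping = {
        "in_heat": ("Buffalo in heat", "green"),
        "will_be_in_heat": ("Buffalo will be in heat", "yellow"),
        "not_in_heat": ("Buffalo not in heat", "red"),
    }
    return mapping[state]

text, color = estrus_indicator("will_be_in_heat")  # yellow state of FIG. 4(b)
```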
[0105] The visual representations of the predicted clinical outcomes as depicted in FIGs. 4(a)-4(c) present a convenient snapshot of the “wellness state” of the buffalo 102 to a farm hand or a veterinarian, thereby enhancing the clinical workflow on the farm. It may be noted that the various examples of the visual representations presented in FIGs. 4(a)-4(c) are for illustrative purposes. Other designs are also anticipated.
[0106] FIG. 5 is a schematic representation 500 of one embodiment 502 of a digital processing system implementing the monitoring platform 118 (see FIG. 1), in accordance with aspects of the present specification. Also, FIG. 5 is described with reference to the components of FIGs. 1, 2, 3, and 4(a)-4(c).
[0107] It may be noted that while the monitoring platform 118 is shown as being a part of the health monitoring system 112, in certain embodiments, the monitoring platform 118 may also be integrated into end user systems such as, but not limited to, the handheld device 402 such as a cell phone, a tablet, a laptop, and the like (see FIGs. 4(a)-4(c)). Moreover, the example of the digital processing system 502 presented in FIG. 5 is for illustrative purposes. Other designs are also anticipated.
[0108] The digital processing system 502 may contain one or more processors such as a central processing unit (CPU) 504, a random access memory (RAM) 506, a secondary memory 508, a graphics controller 510, a display unit 512, a network interface 514, and an input interface 516. It may be noted that the components of the digital processing system 502 except the display unit 512 may communicate with each other over a communication path 518. In certain embodiments, the communication path 518 may include several buses, as is well known in the relevant arts.
[0109] The CPU 504 may execute instructions stored in the RAM 506 to provide several features of the present specification. Moreover, the CPU 504 may include multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, the CPU 504 may include only a single general-purpose processing unit.
[0110] Furthermore, the RAM 506 may receive instructions from the secondary memory 508 using the communication path 518. Also, in the embodiment of FIG. 5, the RAM 506 is shown as including software instructions constituting a shared operating environment 520 and/or other user programs 522 (such as other applications, DBMS, and the like). In addition to the shared operating environment 520, the RAM 506 may also include other software programs such as device drivers, virtual machines, and the like, which provide a (common) run time environment for execution of other/user programs. Moreover, in certain embodiments, the RAM may also include a model 524. The model 524 may be the model 120 (see FIG. 1).
[0111] With continuing reference to FIG. 5, the graphics controller 510 is configured to generate display signals (e.g., in RGB format) for display on the display unit 512 based on data/instructions received from the CPU 504. The display unit 512 may include a display screen to display images defined by the display signals. Furthermore, the input interface 516 may correspond to a keyboard and a pointing device (e.g., a touchpad, a mouse, and the like) and may be used to provide inputs. In addition, the network interface 514 may be configured to provide connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems connected to a network, for example.
[0112] Moreover, the secondary memory 508 may include a hard drive 526, a flash memory 528, and a removable storage drive 530. The secondary memory 508 may store data generated by the system 100 (see FIG. 1) and software instructions (for example, for implementing the various features of the present specification), which enable the digital processing system 502 to provide several features in accordance with the present specification. The code/instructions stored in the secondary memory 508 may either be copied to the RAM 506 prior to execution by the CPU 504 for higher execution speeds or may be directly executed by the CPU 504.
[0113] Some or all of the data and/or instructions may be provided on a removable storage unit 532, and the data and/or instructions may be read and provided by the removable storage drive 530 to the CPU 504. Further, the removable storage unit 532 may be implemented using medium and storage format compatible with the removable storage drive 530 such that the removable storage drive 530 can read the data and/or instructions. Thus, the removable storage unit 532 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can also be in other forms (e.g., non-removable, random access, and the like.).
[0114] It may be noted that as used herein, the term “computer program product” is used to generally refer to the removable storage unit 532 or a hard disk installed in the hard drive 526. These computer program products are means for providing software to the digital processing system 502. The CPU 504 may retrieve the software instructions and execute the instructions to provide various features of the present specification.
[0115] Also, the term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may include non-volatile media and/or volatile media. Non-volatile media include, for example, optical disks, magnetic disks, or solid-state drives, such as the secondary memory 508. Volatile media include dynamic memory, such as the RAM 506. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, an NVRAM, or any other memory chip or cartridge.
[0116] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, the transmission media may include coaxial cables, copper wire, and fiber optics, including the wires that comprise the communication path 518. Moreover, the transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
[0117] Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present specification. Thus, appearances of the phrases “in one embodiment,” “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
[0118] Furthermore, the described features, structures, or characteristics of the specification may be combined in any suitable manner in one or more embodiments. In the description presented hereinabove, numerous specific details are provided such as examples of programming, software modules, user selections, network transactions, database queries, database structures, hardware modules, hardware circuits, hardware chips, and the like, to provide a thorough understanding of embodiments of the specification.
[0119] The aforementioned components may be dedicated hardware elements such as circuit boards with digital signal processors or may be software running on a general-purpose computer or processor such as a commercial, off-the-shelf personal computer (PC). The various components may be combined or separated according to various embodiments of the invention.
[0120] Furthermore, the foregoing examples, demonstrations, and process steps such as those that may be performed by the system may be implemented by suitable code on a processor-based system, such as a general-purpose or special-purpose computer. It should also be noted that different implementations of the present specification may perform some or all of the steps described herein in different orders or substantially concurrently, that is, in parallel. Furthermore, the functions may be implemented in a variety of programming languages, including but not limited to C++, Python, and Java. Such code may be stored or adapted for storage on one or more tangible, machine-readable media, such as on data repository chips, local or remote hard disks, optical disks (for example, CDs or DVDs), memory, or other media, which may be accessed by a processor-based system to execute the stored code. Note that the tangible media may include paper or another suitable medium upon which the instructions are printed. For instance, the instructions may be electronically captured via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in the data repository or memory.
[0121] Embodiments of the systems and methods for monitoring the health of a buffalo described hereinabove advantageously present a robust framework for predicting desired clinical outcomes such as health, heat, and stress in buffaloes on a farm directly using data from optical images and IR images corresponding to the buffaloes. Advantageously, the system 100 provides an “anywhere-anytime” dashboard that is accessible on a mobile phone, a tablet, a laptop, and the like. The dashboard allows secure access only to authorized personnel using identity management and secure sign-on. Further, farm hands can view the live status of each buffalo on the farm using the dashboard. Also, customized reports related to the buffaloes on the farm may be generated and viewed via the dashboard.
[0122] Additionally, the systems and methods presented herein generate clinical outcomes directly based on the optical images and IR images corresponding to the buffalo, thereby providing significant advantages in reliably predicting the clinical outcomes where traditional methods tend to fail. Also, intelligence inferred from the clinical data to generate the models provides a robust framework for use in predicting the clinical outcomes.
[0123] Although specific features of embodiments of the present specification may be shown in and/or described with respect to some drawings and not in others, this is for convenience only. It is to be understood that the described features, structures, and/or characteristics may be combined and/or used interchangeably in any suitable manner in the various embodiments.
[0124] While only certain features of the present specification have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the present specification is intended to cover all such modifications and changes as fall within the true spirit of the invention.
CLAIMS:
1. A system (100) for real-time monitoring of health of livestock on a farm, the system (100) comprising:
an acquisition subsystem (114) configured to obtain a plurality of optical images (106) corresponding to each animal (102) on the farm, a plurality of infrared images (110) corresponding to each animal (102) on the farm, sensor information related to each animal (102) on the farm, animal information related to each animal (102) on the farm, ambient information related to the farm, or combinations thereof;
a processing subsystem (116) in operative association with the acquisition subsystem (114) and comprising a monitoring platform (118), wherein to monitor in real-time the health of an animal (102) on the farm, the monitoring platform (118) is configured to:
retrieve one or more artificial intelligence models (120);
use an artificial intelligence based data fusion approach to integrate data from the plurality of optical images (106) corresponding to the animal (102) and the plurality of infrared images (110) corresponding to the animal (102), the sensor information related to the animal (102) on the farm, the animal information related to the animal (102) on the farm, the ambient information related to the farm, or combinations thereof via an artificial intelligence model (120) to generate one or more outcomes (238, 306), wherein the one or more outcomes (238, 306) correspond to a health condition of the animal (102), a heat/estrus condition of the animal (102), a stress condition of the animal (102), or combinations thereof; and
an interface unit (122, 124) configured to provide, in real-time, the one or more outcomes (238, 306) to facilitate analysis or treatment.
2. The system (100) of claim 1, wherein the data from the plurality of optical images (106) corresponding to the animal (102) and the plurality of infrared images (110) corresponding to the animal (102) comprises segmented anatomical regions of interest in the animal (102), activity in one or more anatomical regions of interest in the animal (102), one or more skin temperature measurements corresponding to one or more desired anatomical regions of interest in the animal (102), and wherein the activity in the animal comprises rapid movement, fluid discharges, changes in color features, swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, or combinations thereof.
3. The system (100) of claim 1, wherein the health of the animal (102) comprises stress, estrus, fertility status, lameness, diarrhea, parturition, acidosis, ketosis, mastitis, laminitis, or combinations thereof, wherein the sensor information comprises pulse rate, heart rate variability, menstrual cycle, and current location corresponding to the animal (102), wherein the ambient information comprises ambient temperature and humidity on the farm, wherein the animal information comprises an age, sex, identification number, height, and weight corresponding to the animal (102) on the farm, and wherein the time stamp information comprises a time stamp associated with acquisition of the plurality of optical images (106) and the plurality of infrared images (110).
4. The system (100) of claim 1, further comprising:
one or more optical cameras (104) strategically positioned across the farm to capture the plurality of optical images (106) corresponding to each animal (102) on the farm; and
one or more infrared cameras (108) strategically positioned across the farm to capture the plurality of infrared images (110) corresponding to each animal (102) on the farm,
wherein strategically positioning the one or more optical cameras (104) and the one or more infrared cameras (108) comprises placing the one or more optical cameras (104) and the one or more infrared cameras (108) to provide optimal coverage of all animals (102) on the farm.
5. The system (100) of claim 1, wherein the monitoring platform (118) is configured to perform low-light enhancement of the one or more optical images (106), the one or more infrared images (110), or both via use of an artificial intelligence model (120).
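Claim 5 leaves the low-light enhancement algorithm open. As a purely illustrative, non-authoritative sketch (not the specification's AI model 120), simple gamma correction shows the kind of brightening step such enhancement performs; the function name and parameters here are assumptions for illustration only.

```python
import numpy as np

def enhance_low_light(image: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """Illustrative low-light enhancement via gamma correction.

    gamma < 1 lifts shadows in an 8-bit image. A deployed system could
    instead use a trained enhancement model as contemplated by claim 5.
    """
    normalized = image.astype(np.float64) / 255.0   # scale to [0, 1]
    enhanced = np.power(normalized, gamma)          # gamma < 1 brightens
    return (enhanced * 255.0).clip(0, 255).astype(np.uint8)

# A uniformly dark frame becomes brighter after enhancement.
dark = np.full((4, 4), 25, dtype=np.uint8)
bright = enhance_low_light(dark, gamma=0.5)
```

In practice the choice of gamma (or of a learned enhancement network) would depend on the camera and the ambient lighting on the farm.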
6. The system (100) of claim 1, wherein to monitor in real-time the health of the animal (102) on the farm, the monitoring platform (118) is configured to utilize the plurality of optical images (106), the plurality of infrared images (110), or a combination thereof based on a lighting condition during the acquisition of the plurality of optical images (106) and the plurality of infrared images (110), wherein the lighting condition comprises a good lighting condition and a poor lighting condition.
7. The system (100) of claim 6, wherein, if the lighting condition comprises a good lighting condition, the monitoring platform (118), via use of one or more artificial intelligence models (120), is configured to:
segment the plurality of optical images (106) to identify one or more anatomical regions of interest in the animal (102);
monitor the one or more identified anatomical regions of interest in the plurality of optical images (106) to detect or track activity in the one or more identified anatomical regions of interest in the animal (102);
register the plurality of optical images (106) with a corresponding infrared image in the plurality of infrared images (110);
segment the plurality of infrared images (110) to identify one or more anatomical regions of interest based on segmentation of the plurality of optical images (106);
identify one or more desired anatomical regions of interest from the anatomical regions of interest in the plurality of infrared images (110); and
obtain skin temperature measurements of the animal (102) corresponding to the one or more desired anatomical regions of interest in the plurality of infrared images (110).
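The good-lighting pipeline of claim 7 (segment optical images, register them to infrared images, and read skin temperatures over the transferred regions of interest) can be sketched minimally as follows. This is a stand-in under stated assumptions: thresholding stands in for AI segmentation, an identity transform stands in for registration of pre-aligned cameras, and all function names are hypothetical, not from the specification.

```python
import numpy as np

def segment_roi(optical: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Stand-in for AI segmentation: mark bright pixels as the region of
    interest (ROI). A real system would use a trained segmentation model."""
    return optical > threshold

def register_identity(mask: np.ndarray) -> np.ndarray:
    """Stand-in for optical-to-infrared registration. The cameras are
    assumed pre-aligned here, so the mapping is the identity transform."""
    return mask

def roi_skin_temperature(infrared_c: np.ndarray, mask: np.ndarray) -> float:
    """Mean temperature (deg C) over the ROI transferred from the optical image."""
    return float(infrared_c[mask].mean())

# Toy frames: a bright patch in the optical image marks the animal's flank,
# and the co-registered IR frame carries per-pixel temperatures in deg C.
optical = np.zeros((4, 4), dtype=np.uint8)
optical[1:3, 1:3] = 200                  # the ROI
infrared_c = np.full((4, 4), 30.0)
infrared_c[1:3, 1:3] = 38.5              # warmer skin over the ROI

mask = register_identity(segment_roi(optical))
temp = roi_skin_temperature(infrared_c, mask)
```

The poor-lighting branch of claim 8 runs the same steps in the reverse direction, segmenting the infrared images first and transferring the regions of interest to the optical images.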
8. The system (100) of claim 7, wherein, if the lighting condition comprises a poor lighting condition, the monitoring platform (118), via use of one or more artificial intelligence models (120), is configured to:
segment the plurality of infrared images (110) to identify one or more anatomical regions of interest in the animal (102);
monitor the one or more identified anatomical regions of interest in the plurality of infrared images (110) to detect or track activity in the one or more identified anatomical regions of interest in the animal (102);
identify one or more desired anatomical regions of interest from the anatomical regions of interest in the plurality of infrared images (110);
obtain skin temperature measurements of the animal (102) corresponding to the one or more desired anatomical regions of interest in the plurality of infrared images (110);
register the plurality of infrared images (110) with a corresponding optical image (106) in the plurality of optical images (106); and
segment the plurality of optical images (106) to identify one or more anatomical regions of interest based on segmentation of the plurality of infrared images (110).
9. The system (100) of claim 8, wherein to detect or track activity in the one or more identified anatomical regions of interest in the animal (102) in the plurality of optical images (106), the plurality of infrared images (110), or both the plurality of optical images (106) and the plurality of infrared images (110), the monitoring platform (118) via use of one or more artificial intelligence models (120) is configured to analyze one or more consecutive optical images (106), one or more consecutive infrared images (110), or both to identify activity in the one or more identified anatomical regions of interest.
10. The system (100) of claim 9, wherein the monitoring platform (118) in conjunction with the one or more artificial intelligence models (120) is configured to use anatomy meshing, anatomy tracking, motion detection, inferencing, or combinations thereof to detect activity corresponding to the segmented anatomical regions of interest in the animal (102).
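Among the techniques named in claim 10, motion detection over consecutive frames is the simplest to illustrate. As a hedged sketch (frame differencing standing in for learned anatomy tracking; the function name and tolerance are assumptions), activity in an ROI can be scored by the fraction of pixels that change between consecutive images:

```python
import numpy as np

def activity_score(frame_a: np.ndarray, frame_b: np.ndarray, tol: int = 10) -> float:
    """Fraction of pixels that changed between two consecutive frames --
    a minimal motion-detection proxy for 'activity' in an ROI. A deployed
    system would use learned anatomy meshing/tracking instead."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return float(np.mean(diff > tol))

frame1 = np.zeros((8, 8), dtype=np.uint8)
frame2 = frame1.copy()
frame2[2:4, 2:4] = 255          # the animal moved within this patch
score = activity_score(frame1, frame2)
```

A threshold on such a score, computed per segmented anatomical region, is one plausible way to flag behaviors like restlessness or tail movement for downstream inferencing.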
11. The system (100) of claim 9, wherein to use the artificial intelligence based data fusion approach to integrate the data, the monitoring platform (118) is configured to provide as input the data corresponding to the plurality of optical images (106) and the plurality of infrared images (110) corresponding to the animal (102), the detected activity in one or more anatomical regions of interest in the animal (102), the one or more skin temperature measurements corresponding to one or more desired anatomical regions of interest in the animal (102), the segmented anatomical regions of interest, the sensor information related to the animal (102), the animal information related to the animal (102), the ambient information related to the farm, or combinations thereof to the artificial intelligence model (120) to cause the artificial intelligence model (120) to provide as an output the one or more outcomes (238, 306).
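One common realization of the multi-modal input described in claim 11 is early (feature-level) fusion: per-modality feature vectors are concatenated into a single input for the model. The sketch below assumes such a scheme; the feature names, values, and the helper function are illustrative only, and feature extraction itself (e.g. image embeddings) is out of scope.

```python
import numpy as np

def fuse_features(optical_feat, infrared_feat, activity_feat,
                  temperature_feat, sensor_feat, animal_feat, ambient_feat):
    """Early fusion: flatten and concatenate per-modality feature vectors
    into one input vector for the downstream AI model."""
    parts = [np.asarray(f, dtype=np.float64).ravel()
             for f in (optical_feat, infrared_feat, activity_feat,
                       temperature_feat, sensor_feat, animal_feat, ambient_feat)]
    return np.concatenate(parts)

fused = fuse_features(
    optical_feat=[0.2, 0.7],      # e.g. pooled optical-image embedding
    infrared_feat=[0.4, 0.1],     # e.g. pooled IR-image embedding
    activity_feat=[1.0],          # e.g. restlessness flag from motion detection
    temperature_feat=[38.5],      # ROI skin temperature, deg C
    sensor_feat=[72.0, 45.0],     # pulse rate, heart-rate variability
    animal_feat=[4.0, 550.0],     # age (years), weight (kg)
    ambient_feat=[31.0, 0.6],     # ambient temperature (deg C), relative humidity
)
```

Later-stage fusion (combining per-modality model outputs) would be an equally valid reading of "data fusion" here; the claim does not commit to either.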
12. The system (100) of claim 1, wherein the monitoring platform (118) is further configured to generate one or more artificial intelligence models (120), and wherein the one or more artificial intelligence models (120) are tuned for performing one or more tasks.
13. The system (100) of claim 12, wherein to generate the one or more artificial intelligence models (120) the monitoring platform (118) is configured to:
obtain a plurality of sets of optical images (106) corresponding to a plurality of animals (102);
obtain a plurality of sets of infrared images (110) corresponding to the plurality of animals (102);
receive an input corresponding to sensor information related to the plurality of animals (102), animal information corresponding to the plurality of animals (102), ambient information related to ambient temperature and humidity on the farm, time stamp associated with acquisition of the plurality of optical images (106), the plurality of infrared images (110), or combinations thereof, wherein the sensor information comprises pulse rate, heart rate variability, menstrual cycle, and current location corresponding to the plurality of animals (102), and wherein the animal information comprises an age, sex, identification number, height, and weight corresponding to the plurality of animals (102);
receive an input corresponding to activity in the plurality of animals (102) and skin temperature measurements corresponding to anatomical regions of interest in the plurality of animals (102), wherein the activity comprises rapid movement, fluid discharges, changes in color features, swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, or combinations thereof;
receive an input corresponding to one or more desired outcomes (238, 306), wherein the one or more desired outcomes (238, 306) correspond to a health condition of the plurality of animals (102), a heat/estrus condition of the plurality of animals (102), or a stress condition of the plurality of animals (102);
optimize model parameters of a neural network (302) based on data corresponding to the plurality of optical images (106), the plurality of infrared images (110), information related to the plurality of animals (102), activity in the plurality of animals (102), skin temperature measurements corresponding to anatomical regions of interest in the plurality of animals (102), the one or more desired outcomes (238, 306), or combinations thereof; and
train the neural network (302) to perform the one or more tasks to generate the one or more artificial intelligence models (120).
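The parameter-optimization and training steps of claim 13 can be sketched with a deliberately tiny stand-in: logistic regression trained by batch gradient descent on synthetic fused features, with a made-up labeling rule. The data, labeling rule, and model are all assumptions for illustration; the specification's neural network (302) is not disclosed at this level of detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set standing in for fused multi-modal features:
# column 0 ~ ROI skin-temperature deviation, column 1 ~ activity score.
# Hypothetical labeling rule: estrus when their sum is positive.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(np.float64)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Logistic-regression stand-in for "optimizing model parameters of a
# neural network": batch gradient descent on the cross-entropy loss.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(500):
    p = sigmoid(X @ w + b)
    grad_w = X.T @ (p - y) / len(y)   # gradient w.r.t. weights
    grad_b = float(np.mean(p - y))    # gradient w.r.t. bias
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = float(np.mean((sigmoid(X @ w + b) > 0.5) == (y > 0.5)))
```

A production system would replace this with a multi-layer network and held-out validation, but the loop structure (forward pass, loss gradient, parameter update) is the same.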
14. The system (100) of claim 1, wherein the monitoring platform (118) is configured to:
store information related to each animal (102) on the farm in a data repository (126); and
record any changes to the information and update the information in the data repository (126).
15. The system (100) of claim 1, wherein the monitoring platform (118) is further configured to:
generate one or more indicators (404, 406, 408, 410, 422, 424, 426, 428, 432, 434, 436, 438) representative of the one or more outcomes (238, 306), and wherein the one or more indicators (404, 406, 408, 410, 422, 424, 426, 428, 432, 434, 436, 438) provide metrics corresponding to a state of one or more outcomes (238, 306); and
provide the one or more indicators (404, 406, 408, 410, 422, 424, 426, 428, 432, 434, 436, 438) corresponding to the one or more outcomes (238, 306) to facilitate further analysis.
16. A method (200) for real-time monitoring of health of livestock on a farm, the method (200) comprising:
receiving (202) sensor information related to each animal (102) on the farm, animal information related to each animal (102) on the farm, ambient information related to the farm, or combinations thereof;
receiving (204) a plurality of optical images (106) corresponding to each animal (102) on the farm, wherein the plurality of optical images (106) is obtained via a plurality of optical cameras (104) strategically positioned on the farm;
receiving (206) a plurality of infrared images (110) corresponding to each animal (102), wherein the plurality of infrared images (110) is obtained via a plurality of infrared cameras (108) strategically positioned on the farm;
retrieving (207) one or more artificial intelligence models (120);
using (236) an artificial intelligence based data fusion approach to integrate data from the plurality of optical images (106) corresponding to each animal (102) and the plurality of infrared images (110) corresponding to each animal (102), the sensor information related to each animal (102) on the farm, the animal information related to each animal (102) on the farm, the ambient information related to the farm, or combinations thereof via an artificial intelligence model (120) to generate one or more outcomes (238, 306), wherein the one or more outcomes (238, 306) correspond to a health condition of the animal (102), a heat/estrus condition of the animal (102), a stress condition of the animal (102), or combinations thereof; and
providing (240) the one or more outcomes (238, 306) to facilitate analysis.
17. The method (200) of claim 16, wherein the health of the animal (102) comprises stress, estrus, fertility status, lameness, diarrhea, parturition, acidosis, ketosis, mastitis, laminitis, or combinations thereof, wherein the data from the plurality of optical images (106) corresponding to each animal (102) and the plurality of infrared images (110) corresponding to each animal (102) comprises segmented anatomical regions of interest in each animal (102), activity in one or more anatomical regions of interest in each animal (102), one or more skin temperature measurements corresponding to one or more desired anatomical regions of interest in each animal (102), wherein the sensor information corresponding to each animal (102) on the farm comprises pulse rate, heart rate variability, menstrual cycle, and current location corresponding to each animal (102), wherein the ambient information comprises ambient temperature and humidity on the farm, wherein the animal information comprises an age, sex, identification number, height, and weight corresponding to each animal (102), wherein the time stamp information comprises a time stamp associated with acquisition of the plurality of optical images (106) and the plurality of infrared images (110), and wherein the activity in the animal (102) comprises rapid movement, fluid discharges, changes in color features, swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, or combinations thereof.
18. The method (200) of claim 16, further comprising:
strategically positioning one or more optical cameras (104) across the farm to capture the plurality of optical images (106) corresponding to each animal (102) on the farm; and
strategically positioning one or more infrared cameras (108) across the farm to capture the plurality of infrared images (110) corresponding to each animal (102) on the farm,
wherein strategically positioning the one or more optical cameras (104) and the one or more infrared cameras (108) comprises placing the one or more optical cameras (104) and the one or more infrared cameras (108) to provide optimal coverage of all animals (102) on the farm.
19. The method (200) of claim 16, further comprising performing (208) low-light enhancement of one or more optical images (106), one or more infrared images (110), or combinations thereof.
20. The method (200) of claim 16, further comprising utilizing the plurality of optical images (106), the plurality of infrared images (110), or a combination thereof to infer the one or more outcomes (238, 306) based on a lighting condition during the acquisition of the plurality of optical images (106) and the plurality of infrared images (110) via use of one or more artificial intelligence models (120), wherein the lighting condition comprises a good lighting condition and a poor lighting condition.
21. The method (200) of claim 20, wherein, if the lighting condition comprises a good lighting condition, the method (200) comprises:
segmenting (212) the plurality of optical images (106) to identify one or more anatomical regions of interest in the animal (102);
monitoring (214) the one or more identified anatomical regions of interest in the plurality of optical images (106) to detect or track activity in the one or more identified anatomical regions of interest in the animal (102);
registering (216) the plurality of optical images (106) with a corresponding infrared image (110) in the plurality of infrared images (110);
segmenting (218) the plurality of infrared images (110) to identify one or more anatomical regions of interest based on segmentation of the plurality of optical images (106);
identifying (220) one or more desired anatomical regions of interest from the anatomical regions of interest in the plurality of infrared images (110); and
obtaining (222) skin temperature measurements of the animal (102) corresponding to the one or more desired anatomical regions of interest in the plurality of infrared images (110).
22. The method (200) of claim 21, wherein, if the lighting condition comprises a poor lighting condition, the method (200) comprises:
segmenting (224) the plurality of infrared images (110) to identify one or more anatomical regions of interest in the animal (102);
monitoring (226) the one or more identified anatomical regions of interest in the plurality of infrared images (110) to detect or track activity in the one or more identified anatomical regions of interest in the animal (102);
identifying (228) one or more desired anatomical regions of interest from the anatomical regions of interest in the plurality of infrared images (110);
obtaining (230) skin temperature measurements of the animal (102) corresponding to the one or more desired anatomical regions of interest in the plurality of infrared images (110);
registering (232) the plurality of infrared images (110) with a corresponding optical image (106) in the plurality of optical images (106); and
segmenting (234) the plurality of optical images (106) to identify one or more anatomical regions of interest based on segmentation of the plurality of infrared images (110).
23. The method (200) of claim 22, wherein detecting or tracking activity in the one or more identified anatomical regions of interest in the animal (102) in the plurality of optical images (106), the plurality of infrared images (110), or both the plurality of optical images (106) and the plurality of infrared images (110) comprises analyzing via use of one or more artificial intelligence models (120) one or more consecutive optical images (106), one or more consecutive infrared images (110), or both to identify activity in the one or more identified anatomical regions of interest.
24. The method (200) of claim 23, wherein analyzing via use of the one or more artificial intelligence models (120) one or more consecutive optical images (106), one or more consecutive infrared images (110), or both to identify activity in the one or more identified anatomical regions of interest comprises using anatomy meshing, anatomy tracking, motion detection, inferencing, or combinations thereof to detect activity corresponding to the segmented anatomical regions of interest of the animal (102).
25. The method (200) of claim 22, wherein using (236) the artificial intelligence based data fusion approach to integrate the data comprises providing as input the data corresponding to the plurality of optical images (106) and the plurality of infrared images (110) corresponding to the animal (102), the detected activity in one or more anatomical regions of interest in the animal (102), the one or more skin temperature measurements corresponding to one or more desired anatomical regions of interest in the animal (102), the segmented anatomical regions of interest, the sensor information related to the animal (102), the animal information related to the animal (102), the ambient information related to the farm, or combinations thereof to the artificial intelligence model (120) to cause the artificial intelligence model (120) to provide as an output the one or more outcomes (238, 306).
26. The method (200) of claim 16, further comprising generating one or more artificial intelligence models (120), wherein the one or more artificial intelligence models (120) are tuned for performing the one or more tasks, and wherein generating the one or more artificial intelligence models (120) comprises:
obtaining a plurality of sets of optical images (106) corresponding to a plurality of animals (102);
obtaining a plurality of sets of infrared images (110) corresponding to the plurality of animals (102);
receiving an input corresponding to sensor information related to the plurality of animals (102), animal information corresponding to the plurality of animals (102), ambient information related to ambient temperature and humidity on the farm, a time stamp associated with acquisition of the plurality of optical images (106), the plurality of infrared images (110), or combinations thereof, wherein the sensor information comprises pulse rate, heart rate variability, menstrual cycle, and current location corresponding to the plurality of animals (102), and wherein the animal information comprises an age, sex, identification number, height, and weight corresponding to the plurality of animals (102);
receiving an input corresponding to activity in the plurality of animals (102) and skin temperature measurements corresponding to anatomical regions of interest in the plurality of animals (102), wherein the activity comprises rapid movement, fluid discharges, changes in color features, swelling of the vulva, clear transparent mucus discharge, spontaneous milk let down, bellowing, restlessness, nervousness, frequency of urination, licking, arching of back, sniffing, head lift up, lip curling, or combinations thereof;
receiving an input corresponding to one or more desired outcomes (238, 306), wherein the one or more desired outcomes (238, 306) correspond to a health condition of the plurality of animals (102), a heat/estrus condition of the plurality of animals (102), or a stress condition of the plurality of animals (102);
optimizing model parameters of a neural network (302) based on data corresponding to the plurality of optical images (106), the plurality of infrared images (110), information related to the plurality of animals (102), activity in the plurality of animals (102), skin temperature measurements corresponding to anatomical regions of interest in the plurality of animals (102), the one or more desired outcomes (238, 306), or combinations thereof; and
training the neural network (302) to perform the one or more tasks to generate the one or more artificial intelligence models (120).
27. A processing system (116) for real-time monitoring of health of livestock on a farm, the processing system (116) comprising:
a monitoring platform (118), wherein the monitoring platform (118) is configured to:
retrieve one or more artificial intelligence models (120);
use an artificial intelligence based data fusion approach to integrate data from a plurality of optical images (106) corresponding to an animal (102) on the farm and a plurality of infrared images (110) corresponding to the animal (102), sensor information related to the animal (102) on the farm, animal information related to the animal (102) on the farm, ambient information related to the farm, or combinations thereof via an artificial intelligence model (120) to generate one or more outcomes (238, 306), wherein the one or more outcomes (238, 306) correspond to a health condition of the animal (102), a heat/estrus condition of the animal (102), a stress condition of the animal (102), or combinations thereof; and
provide the one or more outcomes (238, 306) to facilitate analysis.