Abstract: Animal assaults that cause crop damage are one of the main factors lowering crop productivity. Farm crops are frequently destroyed by local wildlife such as buffaloes, cows, goats, and birds. Farmers cannot guard their fields around the clock or barricade entire areas, and they must respond differently to the different kinds of creatures that attempt to enter these forbidden zones. The general techniques farmers use to stop animals from damaging their crops include erecting physical barriers, using electric fences, conducting manual surveillance, and other similarly strenuous and hazardous methods. In this invention, a sowing-safety system is proposed for sustainable farming that uses a camera to scan for and identify stray animals or birds in order to safeguard crops. Once an animal or bird is identified, the system emits an animal-deterrent sound. 4 Claims & 1 Figure
Description: Field of the Invention
The invention relates to the use of artificial intelligence-based deep learning algorithms to protect crops from wild animals and birds, thereby improving cultivation by preventing crop damage.
Objectives of the Invention
These days, animal attacks on crops are a common occurrence in India. Because no detection system is in place, these attacks destroy farmers' crops, and without appropriate safety precautions farmers are left defenseless. Moreover, animals intrude constantly and ruin crops. Paddy fields and other crops are not always fenced, so there is a very real chance that goats and cows will devour them, wasting a large part of the farmers' harvest. The primary goal of this innovation is to defend crops from animal attacks.
Background of the Invention
In recent years, animal detection systems have been used to prevent or reduce collisions between animals and cars. These devices specifically target wild creatures that can kill or injure humans or damage property, detecting them before they cross the road. Historically, signs alerting drivers to possible animal crossings have been used to prevent animal-vehicle collisions. In some instances, fences or wildlife warning reflectors have been installed to deter animals from approaching the road, and in certain regions wildlife crossing structures have been integrated with wildlife fencing.
Computers may now perform tasks without explicit programming thanks to machine learning. Modern techniques train machines by showing them correct pairings of inputs and outputs, an approach known as supervised learning. When classifying photos, for instance, the computer is trained on numerous pairs of images and labels, where the image is the input and the output is the appropriate label (such as "buffalo"). Since the problem described above persists despite existing attempts to solve it, we took a deep learning approach to automatically scare off the animals. For this innovation, we performed the necessary pre-processing using libraries such as playsound and Keras. The input is provided by a CCTV (closed-circuit television) camera. The code processes and classifies the frames obtained from the camera, and an appropriate repellent sound is played to drive away the detected animal.
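The input-output pairing idea described above can be illustrated with a toy example. The sketch below is not the convolutional network used in this invention; it is a minimal 1-nearest-neighbour classifier, included only to show how supervised learning memorises labelled (input, label) pairs and predicts the label of the closest known input. The feature vectors and labels are illustrative.

```python
# Toy illustration of supervised learning: store labelled (input, label)
# pairs, then predict the label of the nearest stored input.
# The real system uses a deep convolutional network; this only shows
# the input -> output pairing idea.

def train(pairs):
    """'Training' here is simply memorising the labelled examples."""
    return list(pairs)

def predict(model, x):
    """Return the label of the stored feature vector closest to x."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda p: dist(p[0], x))[1]

# Two hypothetical 2-D feature vectors with their labels.
model = train([((0.9, 0.1), "buffalo"), ((0.1, 0.9), "bird")])
```

A query near the first stored vector, e.g. `predict(model, (0.8, 0.2))`, is labelled "buffalo"; a query near the second is labelled "bird".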
When we look at images or videos, we can locate and identify objects of interest with ease. Transferring this ability to computers requires object detection: finding and identifying the objects in a scene. Object detection has applications in many fields, including autonomous driving, video surveillance, and image retrieval systems. While several techniques are available for object detection, we concentrate on the YOLOv3 approach.
A dynamic data storage device stores a large number of dynamic samples, collected in advance, as a template group. The livestock management system is characterized in that the camera tracks each individual by performing image-recognition processing against these templates, based on identification data obtained by locking on to the individual's wireless equipment. It improves livestock productivity by reducing the time required for breeding and health-status monitoring (JP2015248834A).
Image recognition system and method capable of recognizing object state and preventing missed detection: This invention discloses an image recognition system and method that can recognize object states and prevent missed detections, and relates to the technical field of image recognition systems. The system preliminarily judges whether an object is a detection target through a preliminary judgment module; an image processing module processes and extracts data such as the length, width, and included angle of the object to facilitate subsequent data matching; further accurate recognition is then performed on the basis of fuzzy images; the detection target can be conveniently recognized through an object recognition module; and the current state of the detection target can be output according to the matched three-dimensional model state (CN115424189A).
An animal sensing system can include an input section with an entrance and an output section with at least two output paths, each with its own exit. An animal can enter the system through the entrance. A sensor located inside the input section's sensing region may detect one or more animal characteristics and relay them to a central processing unit. Using the received data, the central processing unit classifies the animal according to the detected characteristic(s). Based on the classification, it can then operate a gate or other directional guide within the system, granting the animal access to only one of the output paths at a time, so that the animal is guided to a desired place via a single output path as it exits the system (US11617353B2).
A video camera connected to a computer processing system has a target region placed within its field of view. A computer program designed for animal identification quickly identifies a target animal using convolutional neural networks, deep learning algorithms, and camera images. The program is trained, using a learning algorithm and related machine learning technologies, to reliably identify target animals; a deterrent can be deployed within two seconds of detection, giving the animal very little time to cause harm to the target location (US11369106B2). A method comprising: receiving an image of a scene including one or more animals; defining a zone boundary for each of the one or more animals; evaluating each of the region boundaries for suitability for further processing based at least in part on a predetermined set of parameters; and determining at least one of: (i) based at least in part on the further processing, a physical state of at least some of the one or more animals, and (ii) an identity of at least some of the one or more animals (CN112655019A). Systems and procedures for performing behavioral detection using 3D tracking and machine learning (US10121064B2): Various implementations of this invention describe systems and procedures for behavioral detection using three-dimensional tracking and machine learning. In one embodiment, the invention consists of a classification application that instructs a microprocessor to: identify a sequence of frames of image data that includes depth information; determine the poses of the subjects; extract a set of parameters describing the poses and movement of at least the primary and secondary subjects; and use a classifier, trained to discriminate between a plurality of social behaviors, to detect a social behavior carried out by at least the primary subject and involving at least the secondary subject.
Summary of the Invention
Our goal is to build a crop-protection system using Artificial Intelligence. The system is called "SOWING SAFETY: THE DAWN OF AI-ENABLED SCARECROW SOLUTIONS FOR SUSTAINABLE FARMING". It detects animals entering the crop field and plays the extermination sound most feared by each animal. We use the YOLOv3 algorithm, a real-time object detection algorithm that detects animals in live video. The cv2 module is used to capture video, and the playsound module is used to play the extermination sound. YOLOv3 identifies animals and their names using the coco.names class list. Its accuracy surpasses that of existing systems.
Detailed Description of the Invention
"Sowing Safety: The Dawn of AI-Enabled Scarecrow Solutions for Sustainable Farming" encapsulates the fusion of traditional agricultural practices with cutting-edge technology. In this concept, AI serves as a pivotal tool in revolutionizing pest control methods, particularly in scarecrow technology. By integrating AI algorithms with scarecrow systems, farmers can create dynamic, adaptive solutions that respond in real-time to pest threats, thereby enhancing crop protection and promoting sustainable farming practices. This innovative approach holds promise for reducing reliance on harmful pesticides, optimizing resource usage, and safeguarding both crop yields and environmental health.
In this new approach, AI emerges as a game-changer, providing a dynamic and proactive defense mechanism. By leveraging AI algorithms, scarecrow systems become intelligent entities capable of real-time monitoring and decision-making. These systems are equipped with sensors that detect pest activity and environmental conditions, feeding this data into AI models for analysis. Through machine learning, these models can discern patterns in pest behavior, predict potential threats, and optimize scarecrow responses accordingly.
The result is a revolutionized pest control strategy that is both responsive and sustainable. Rather than relying solely on static scarecrows, farmers now have access to adaptive solutions that evolve with changing conditions. When a pest threat is detected, AI-enabled scarecrows can employ a variety of tactics, such as sound, motion, or even visual deterrents, tailored to deter specific pests effectively. Moreover, these systems can adjust their strategies over time based on feedback and new data, continuously improving their efficacy.
One of the most significant advantages of AI-enabled scarecrow solutions is their potential to reduce reliance on harmful pesticides. By providing an alternative method for pest control, these systems offer farmers a way to minimize chemical usage while still protecting their crops. This not only benefits environmental health by reducing chemical runoff and pollution but also promotes the long-term sustainability of agriculture by preserving soil and water quality.
Furthermore, by optimizing scarecrow responses based on real-time data, AI-enabled solutions can help farmers make more efficient use of resources. By targeting pest threats precisely when and where they occur, these systems minimize wasted effort and resources, ultimately leading to increased crop yields and profitability.
Overall, "Sowing Safety" represents a promising frontier in sustainable farming practices. By marrying traditional wisdom with cutting-edge technology, AI-enabled scarecrow solutions offer a glimpse into the future of agriculture, where innovation and sustainability go hand in hand. As these technologies continue to evolve and become more accessible, they hold the potential to transform not only how we protect our crops but also how we steward the land for generations to come.
The proposed system builds crop protection from animals using Artificial Intelligence. The system, called "SOWING SAFETY: THE DAWN OF AI-ENABLED SCARECROW SOLUTIONS FOR SUSTAINABLE FARMING", detects animals entering the crop field and plays the extermination sound most feared by each animal. We use the YOLOv3 algorithm, a real-time object detection algorithm that detects animals in live video. The cv2 module is used to capture video, and the playsound module is used to play the extermination sound. The coco.names class list is used to identify animals and their names once they are detected by the YOLOv3 algorithm. Its accuracy is higher than that of existing systems.
The playsound module contains a single function, playsound. Its only required argument is the path to the file containing the sound to play, which may be a local file or a URL. The second argument, block, is optional and defaults to True; setting it to False causes the function to return immediately and play the sound asynchronously. WAV and MP3 files have been tried and tested on Windows; other file formats may also work. In this invention, when an animal is recognized, the warning sound is output via playsound. Attaching an alarm-sound audio file is simple, and the output alert plays immediately once triggered with the help of the playsound library.
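The playsound usage described above can be sketched as follows. This is a minimal, hedged example: the per-animal sound file names and the `ANIMAL_SOUNDS` mapping are illustrative assumptions, not files specified by the invention, and the playsound call is guarded so the sketch degrades gracefully if the library is not installed.

```python
# Hypothetical sketch: choose and play a deterrent sound for a detected
# animal label. File paths and the label->sound mapping are illustrative.

ANIMAL_SOUNDS = {
    "cow": "sounds/cow_repellent.wav",
    "bird": "sounds/bird_repellent.wav",
}

def deterrent_for(label, default="sounds/generic_alarm.wav"):
    """Return the repellent sound file for a detected animal label,
    falling back to a generic alarm for unknown labels."""
    return ANIMAL_SOUNDS.get(label, default)

def scare(label):
    """Play the deterrent asynchronously (block=False) so the
    detection loop is not held up while the sound plays."""
    try:
        from playsound import playsound  # third-party: pip install playsound
        playsound(deterrent_for(label), block=False)
    except ImportError:
        # playsound not available in this environment; report instead.
        print("playsound not installed; would play", deterrent_for(label))
```

For example, `scare("cow")` would play `sounds/cow_repellent.wav` without blocking, while an unmapped label such as `"goat"` falls back to the generic alarm.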
Simpleaudio can play WAV files and NumPy arrays and can also verify whether a file is still playing. Winsound is a Windows-only module that plays WAV files through the speakers. python-sounddevice and pyaudio provide bindings for the PortAudio library, enabling cross-platform WAV playback. YOLOv3 (You Only Look Once, Version 3) is a real-time object detection system that can recognize particular objects in pictures, videos, or live feeds; it uses features learned by a deep convolutional neural network to identify objects. YOLO versions 1-3 were developed by Joseph Redmon and Ali Farhadi: the initial version was produced in 2016, and version 3 followed in 2018 as an enhanced form of YOLO and YOLOv2. YOLO can be implemented with deep learning packages such as Keras or OpenCV. The COCO dataset, meaning "Common Objects in Context", is a collection of challenging, high-quality datasets for computer vision, the majority of which are used with state-of-the-art neural networks; a dataset format is also named after it. In this system, the webcam feed is captured with OpenCV, animals are detected with the YOLOv3 algorithm, and the animal's name is identified from the coco.names class list. Once an animal is identified, an extermination sound is produced by the playsound module; if no animal is detected, the process terminates.
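The capture-detect-identify pipeline described above can be sketched as below. This is an illustrative outline, not the invention's exact code: the config/weights file names (`yolov3.cfg`, `yolov3.weights`, `coco.names`), the confidence threshold, and the `ANIMAL_CLASSES` subset of COCO labels are assumptions, and the output-parsing step is left as a comment.

```python
# Hypothetical sketch of the detection loop: OpenCV capture, YOLOv3
# inference via cv2.dnn, and filtering detections to COCO animal labels.

# Subset of COCO labels that are animals (illustrative).
ANIMAL_CLASSES = {"bird", "cat", "dog", "horse", "sheep", "cow",
                  "elephant", "bear", "zebra", "giraffe"}

def detected_animals(labels, confidences, threshold=0.5):
    """Keep only confident detections whose COCO label is an animal."""
    return [label for label, conf in zip(labels, confidences)
            if conf >= threshold and label in ANIMAL_CLASSES]

def run(camera_index=0):
    import cv2  # OpenCV: pip install opencv-python
    # Assumed local file names for the YOLOv3 model and class list.
    net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
    classes = open("coco.names").read().splitlines()
    cap = cv2.VideoCapture(camera_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # YOLOv3 expects a 416x416 normalized RGB blob.
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                     swapRB=True)
        net.setInput(blob)
        outputs = net.forward(net.getUnconnectedOutLayersNames())
        # ...parse `outputs` into per-detection labels (via `classes`)
        # and confidences, then:
        # if detected_animals(labels, confidences): play repellent sound
    cap.release()
```

The `detected_animals` helper is where the coco.names lookup meets the alarm decision: a confident "cow" detection triggers the sound, while a "person" or a low-confidence detection does not.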
Sowing Safety: The Dawn of AI-Enabled Scarecrow Solutions for Sustainable Farming is a system that repels wild animals attempting to enter the field by playing the sounds they fear. Conclusion: by playing different repulsive noises, we can identify the animals and turn them away before they enter the field. The issue of wild animals damaging crops has grown into a significant social problem. At the same time, every farmer should remain mindful that animals are living things that must be protected from unnecessary pain during crop production. The problem needs immediate attention and a workable solution. This system saves manpower and prevents agricultural damage; it is highly cost-effective and helpful for farmers. The module safeguards farms and poses no threat to humans or animals. This invention therefore has great social significance: it helps farmers safeguard their crops, spares them large financial losses, and removes the need for fruitless manual efforts to defend their fields, helping ensure that crops are kept safe from animal harm.
4 Claims & 1 Figure
Brief description of Drawing
The figure illustrates an exemplary embodiment of the invention.
Figure 1: The Process of the Proposed Invention. Claims: The scope of the invention is defined by the following claims:
Claim:
1. A system/method to detect the animals using the Artificial Intelligence based Deep Learning algorithms, said system/method comprising the steps of:
a) The system starts with dataset collection from various cameras (1); from these images, attributes are extracted to build the datasets (2).
b) The proposed invention incorporates preprocessing steps (3) to identify the important, predictable images (4); the filtered data undergoes a feature extraction process (5); the image is matched and the accuracy metric is compared (6); finally, if an animal is identified, an alarm sound is played (7).
2. As mentioned in claim 1, the invented system starts with various images, with an image dataset uploaded to begin the process.
3. According to claim 1, preprocessing is initiated to remove noisy data from the dataset, and it triggers the feature extraction process of the YOLOv3 algorithm, splitting the data into training and testing parts.
4. According to claim 1, the proposed invention starts from the YOLOv3 functions, which are matched against the captured image to detect animals and their types using YOLOv3-architecture-based deep learning algorithms.
| # | Name | Date |
|---|---|---|
| 1 | 202441049928-REQUEST FOR EARLY PUBLICATION(FORM-9) [29-06-2024(online)].pdf | 2024-06-29 |
| 2 | 202441049928-OTHERS [29-06-2024(online)].pdf | 2024-06-29 |
| 3 | 202441049928-FORM-9 [29-06-2024(online)].pdf | 2024-06-29 |
| 4 | 202441049928-FORM FOR STARTUP [29-06-2024(online)].pdf | 2024-06-29 |
| 5 | 202441049928-FORM FOR SMALL ENTITY(FORM-28) [29-06-2024(online)].pdf | 2024-06-29 |
| 6 | 202441049928-FORM 1 [29-06-2024(online)].pdf | 2024-06-29 |
| 7 | 202441049928-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [29-06-2024(online)].pdf | 2024-06-29 |
| 8 | 202441049928-EDUCATIONAL INSTITUTION(S) [29-06-2024(online)].pdf | 2024-06-29 |
| 9 | 202441049928-DRAWINGS [29-06-2024(online)].pdf | 2024-06-29 |
| 10 | 202441049928-COMPLETE SPECIFICATION [29-06-2024(online)].pdf | 2024-06-29 |