Automated Flock Density and Movement Estimation for Welfare Monitoring in Commercial Egg Farms

Published: October 23, 2023
By: L. YU 1, J. XU 1, R. SHEPARD 2, Q. WU 1, R. JENNER 3 and J. ZHANG 1 / 1 School of Electrical and Data Engineering, University of Technology Sydney; jian.zhang@uts.edu.au; 2 Herd Health Pty Ltd; richard@herdhealth.com.au; 3 Rosetta Management Consulting Pty Ltd. rod_jenner@hotmail.com
Summary

The monitoring of poultry behaviour provides an opportunity to aid egg production and animal welfare. With current developments in machine learning and computer vision, automated content analysis has become a practical route to low-cost, continuous monitoring of animal behaviour. In this paper, we present a simple yet effective flock monitoring system based on deep learning techniques that allows egg farmers to reduce labour while improving performance. This work shows that it is possible to auto-analyse flock activities, and thereby provide early warning of welfare issues, by applying object detection, segmentation, tracking and dense counting techniques. With the model trained on labelled data from the collected video footage, the system achieves 94% accuracy and automatically generates activity reports.

I. Introduction

It is expected that the global demand for livestock products will increase by 70% by the year 2050 (Gerland et al., 2014). Eggs are a high-protein and comparatively environmentally friendly food, making egg production an important source of human nutrition. Intensive egg production enables cheap, nutritious and readily available human food; however, one of the challenges is to ensure the production systems can also meet the birds' needs, including comfort, health and positive experiences. Monitoring bird welfare in large egg farms (typically 25,000-50,000 birds per shed) can be challenging, being labour intensive and requiring expertise. An increasing number of Precision Livestock Farming (PLF) technologies are being examined for this purpose (Dawkins et al., 2013). Specifically, auto-monitoring systems equipped with artificial intelligence (AI) models bring the potential for non-intrusive and continuous monitoring in practically applicable solutions for egg farms. Effective auto-analysis is expected to bring welfare and performance improvements to commercial egg production.
Summaries of bird activity and distribution reflect bird behaviour, which in turn reflects the flock's welfare status. Specifically, the density and movement patterns of birds within the flock give reliable information on the welfare status of the flock. For example, real-time monitoring of density and movement can give early warning of smothering, where several hens pile up (often resulting in suffocation). To observe these and other important flock activities, we developed a low-cost and easy-to-use system based on recent AI techniques to auto-estimate the density and movement of birds within the shed.
This paper introduces the design and implementation of the auto-monitoring and analysis system. As the project is at an early stage, our case study was conducted under natural daylight conditions, because significant changes in illumination strongly affect model performance when training on limited data. Based on the numerical case study, we further discuss the potential commercial value of wide deployment of our system in egg farms.

II. Method

Our system is based on object detection, tracking and instance segmentation techniques in computer vision, where the core component is a data-driven model. The effectiveness of the monitoring and analysis depends heavily on collecting large video (training) data sets from sheds within the egg farm. In this section, we describe the settings applied in the case study.

Data collection

Data collection was conducted on a free-range laying hen farm in NSW by installing several cameras on the farm from 15/03/2021 to 16/06/2021. Following the farm's installation procedure, which satisfies the ethics criteria, we positioned two kinds of cameras (PTZ and AXIS) to provide a top-down view (to avoid bird occlusions), then connected them to a desktop. The height of the platform was adjustable to allow full coverage of the study space. In our case, each camera covers 25 m² indoors and 50 m² outdoors. The live processed result was then transferred via Wi-Fi and Bluetooth to a central computer, allowing the manager to monitor birds in real time and collating data for AI analysis. Videos were high-resolution, recording both indoor and outdoor environments for over 1,000 hours. Figure 1 illustrates our video data collection settings.
Figure 1 - The video data collection settings: indoor camera setup (left), outdoor camera setup (middle), and the cameras used for data collection (right).

Computation backend

To begin, we briefly define the key technical terms: (1) object (bird) detection is the cornerstone of the whole computational framework, which aims to locate each bird in the video frame by computing a rectangular bounding box; (2) instance segmentation is an extension of bird detection that accurately draws a contour of each bird, providing more detail about the bird's motion, which helps to describe individual behaviours; (3) object (bird) tracking draws a trajectory of a bird across a sequence of video frames. Tracking is usually built on bird detection or segmentation; in this way, the movement of each bird can be tracked and recorded; (4) crowd counting estimates the number of birds in the observable area of the camera. This is the key indicator of density.
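As a concrete illustration of the detection step, the sketch below (our own illustration, not code from the system) computes intersection-over-union (IoU), the standard criterion for deciding whether a predicted bounding box matches an annotated bird:

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2), x1 < x2, y1 < y2."""
    # Intersection rectangle (empty if the boxes do not overlap)
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A prediction is typically accepted as a correct detection when its IoU with a labelled box exceeds a fixed threshold (commonly 0.5).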

Implementation details

The computation backend is implemented by training AI models on annotated image data with advanced computer vision and machine learning techniques. For data preparation, we densely labelled over 25,000 bounding boxes and contours of birds from 300 sampled frames, where each bounding box or contour describes exactly one bird. These frames were manually selected from multiple periods under various visual conditions. Table 1 gives the details of the training and validation datasets. For model training, we applied RetinaNet (Lin et al., 2017) and Mask-RCNN (He et al., 2017) to train the object detection and instance segmentation models, respectively. On the validation dataset, the two models achieved 94% and 92% accuracy for the indoor environment. We used a Kalman filter to track every detected bird in the video footage (Wang et al., 2020), and each trajectory was estimated by linking all positions of a bird across the frame sequence at small time intervals. Bird detection, segmentation, tracking and crowd counting are visualized in Figure 2.
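The trajectory-linking step can be illustrated with a toy sketch. For simplicity it replaces the Kalman filter with greedy nearest-neighbour association between consecutive frames, so it is a simplified stand-in rather than the system's actual implementation; the function name and the distance gate are ours:

```python
import math

def link_tracks(frames, max_dist=30.0):
    """frames: list of per-frame lists of (x, y) bird centres.
    Returns trajectories (lists of points), greedily linking each track's last
    position to the nearest unclaimed detection in the next frame, provided it
    lies within max_dist pixels."""
    tracks = [[p] for p in frames[0]]
    for detections in frames[1:]:
        free = list(detections)
        for track in tracks:
            if not free:
                break
            last = track[-1]
            nearest = min(free, key=lambda p: math.dist(last, p))
            if math.dist(last, nearest) <= max_dist:
                track.append(nearest)
                free.remove(nearest)
        # Unclaimed detections start new trajectories (e.g. a bird entering view)
        tracks.extend([p] for p in free)
    return tracks
```

A Kalman filter improves on this by predicting each bird's next position from its velocity before association, which is more robust when birds move quickly or detections drop out for a frame.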
Figure 2 - Visualizations of bird detection, segmentation, tracking, and crowd counting. In (a), (b) and (d), we used bounding boxes, contours and dots marked with different colours to represent individual birds in the observation area. In (c) the red line represents a trajectory of a moving bird within 10 video frames.
Table 1 - Overview of the data for model training and validation.

III. Results

Based on the trained models, we conducted a case study for bird detection, segmentation and real-time crowd counting in both indoor and outdoor environments.

Indoor environment

We applied the trained model to a full-day video footage sequence (taken on 18/04/2021), as shown in Figure 3, where a few density peaks (more than 200 birds within the frame) are observed over the day. These sudden density increases may trigger alerts for further investigation. Figure 4 shows the visualization of the two methods. The numbers counted by detection and dense counting are 213 and 219, respectively; the discrepancy is caused by double-counting in the dense areas of the frame. With faster (local) computer processors, the model can run in real time, providing live statistics for immediate notifications. When the shooting area is not crowded, both algorithms give comparable results; when crowding makes it difficult to distinguish the boundaries of individual birds, the crowd counting algorithm outperforms object detection by 1.5% in counting accuracy.
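The alerting logic described above reduces to a threshold check over the per-frame counts. The sketch below is illustrative only (the function and its name are ours), with the 200-bird threshold taken from the indoor case study:

```python
def density_alerts(counts, threshold=200):
    """counts: list of (timestamp, bird_count) pairs from the live counter.
    Returns the timestamps whose count exceeds the alert threshold, i.e. the
    density peaks that should be flagged for further investigation."""
    return [t for t, n in counts if n > threshold]
```

In a deployed system these flagged timestamps would feed the manager's real-time notifications rather than being collected into a list.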
Figure 3: The number of birds observed in different periods (indoor).
Figure 4: The screenshot at 7:25 AM with 207 manually counted birds (left). The middle and right images visualize the detection and crowd counting results, respectively.

Outdoor environment

Our two outdoor cameras were mounted on a 4 m-high pole, covering a comparably large visible area (approximately 50 m²). We give example results (13/05/2021) of crowd counting and the visualization of density estimation at 4:39 PM in Figure 5. In the outdoor case, birds can move freely, so the density changes more significantly over the day.
Figure 5 - The number of birds observed in different periods of the entire day (left), the visualization of density estimation based on equal-sized windows (middle) and clustering (right).

Region analysis

Region analysis aims to dynamically estimate the density of birds in different regions of an observation area, which is particularly important for monitoring the piling behaviours that lead to smothering. Here we propose two solutions, based on equal-sized windows and on clustering, respectively. The first partitions the observable region into equal-sized windows, so that the density (e.g., number of birds per square metre) within each window can be estimated separately. The second applies a clustering algorithm to the density map of the whole image to segment the regions automatically. The regions with estimated densities are illustrated in Figure 5 (middle and right).
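A minimal sketch of the first (equal-sized-window) solution, assuming bird positions are already available in metres from the detector; the function name and grid parameters are illustrative, not from the system:

```python
def window_densities(positions, area_w, area_h, nx, ny):
    """Partition an area_w x area_h metre region into an nx x ny grid of
    equal-sized windows and return birds per square metre for each window.
    positions: list of (x, y) bird locations in metres."""
    counts = [[0] * nx for _ in range(ny)]
    for x, y in positions:
        # Clamp to the last window so points on the far edge are still counted
        col = min(int(x / (area_w / nx)), nx - 1)
        row = min(int(y / (area_h / ny)), ny - 1)
        counts[row][col] += 1
    cell_area = (area_w / nx) * (area_h / ny)
    return [[c / cell_area for c in row] for row in counts]
```

A window whose density stays far above its neighbours across consecutive frames is a candidate piling region; the clustering solution removes the fixed grid by segmenting the density map directly.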

IV. Discussion and conclusion

In this paper, we have proposed applying advanced computer vision and machine learning techniques to monitoring flocks in commercial egg farms. With automated observation and analysis, the system has the potential to help monitor animal welfare and improve the commercial value of egg production in Australia. This case study mainly observes the flock behaviour of laying hens; however, based on the detection, segmentation and tracking methods, we can further analyse individual bird behaviours such as feather sucking and feather pecking.
ACKNOWLEDGEMENT: The authors are grateful to Australian Eggs for their financial support of this study.

Presented at the 33rd Annual Australian Poultry Science Symposium 2022.

Dawkins MS, Cain R, Merelie K & Roberts SJ (2013) Applied Animal Behaviour Science 145(1-2): 44-50.

Gerland P, Raftery AE, Ševčíková H, Li N, Gu D, Spoorenberg T, Alkema L, Fosdick BK, Chunn J & Lalic N (2014) Science 346: 234-237.

He K, Gkioxari G, Dollár P & Girshick R (2017) Proceedings of the IEEE international conference on computer vision 2961-2969.

Lin TY, Goyal P, Girshick R, He K & Dollár P (2017) Proceedings of the IEEE international conference on computer vision 2980-2988.

Wang B, Liu H, Samaras D & Hoai M (2020) arXiv preprint arXiv:2009.13077.
