I. Introduction
In Australia, 95% of the population are concerned about animal welfare, with 91% desiring better ways to monitor and improve it (National pest & disease outbreaks, 2018). There is increasing understanding of, and focus on, animal sentience and the need to ensure farm animals have ‘lives worth living’. This applies to the Australian egg industry, where research shows the community believes hens are entitled to positive and rewarding lives (Edgar et al., 2013). Good hen welfare also promotes better egg quality and productivity (Sossidou et al., 2009). Typically, components of bird welfare are assessed by regularly capturing birds to take blood and feather samples, or through in-person visual observations. These methods are intrusive: they disrupt the shed, impose stress on captured birds, are not continuous and are prone to sampling and measurement bias introduced by the observer. Findings are also difficult to communicate to the public, because this approach requires thorough analysis of the correlations between visual observations and welfare issues. A potentially more effective approach is to observe and analyse bird behaviours from video streams captured by mounted cameras. Without human interruption, the birds behave naturally. Although this approach has many benefits (it avoids sampling/measurement bias and disruption, and can operate continuously in real time), developing and refining systems that analyse video content requires expertise, extensive data analysis and algorithm development. The challenge lies in building an automated computer vision system that can assess flock status from the collated evaluation of individual bird behaviours and affective states. Such an automated system supports hands-free, continuous monitoring of flock activities and welfare. It both supports animal welfare assessment and provides a readily communicable, objectively measurable suite of information demonstrating effective welfare and management to the public. The system can also monitor egg productivity through the evaluation of production-sensitive behaviours such as feeding and nesting.
II. Method
Our system is based on object detection and tracking techniques developed within computer vision research. Accurate identification of individual behaviours requires large video datasets. To collect these data, our team set up cameras on a commercial egg farm in NSW and recorded video footage from 15/03/2021 to 16/06/2021. We mounted two 4K cameras to cover 4 eating and 10 drinking areas (see Figure 1 below). Both cameras had zoom capability to allow fine-grained monitoring of individual birds. In this case study, we defined two activities: eating and drinking. Through observation of the video footage, we translated these two terms into operational definitions to facilitate annotation for the computer vision model. Specifically, eating and drinking were defined as a bird putting its head inside the feeder or the drinker and maintaining this action for at least a set period. The observed area is illustrated in Figure 1. We densely annotated over 3,700 instances of these defined activities (see Table 1) to collate the training data used to develop a deep-learning-based behaviour detection model (YOLOX; Ge et al., 2021). A validation set was held out from the training data and used to test the trained model. We obtained 95.4% accuracy in detecting and identifying these activities, which is more than adequate for real-time monitoring at the flock level. We further pre-set the eating and drinking areas (see the red circle and rectangle areas in Figure 1) to reduce the false alarm rate. However, while the detection model can identify behaviours that appear in a static image, it cannot confirm whether an individual bird sustains the action across the video footage. We therefore applied a Kalman filter (Wang et al., 2020) to track individual birds across the frame sequence. Based on this combination of object detection and tracking, the trained artificial intelligence (AI) model can monitor and automatically analyse individual bird behaviours.
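To illustrate the region-based filtering step, a minimal Python sketch is given below. It assumes the detector returns bounding-box centres with class labels and confidence scores; the `Detection` structure, region coordinates and confidence threshold are illustrative placeholders rather than the configuration used in this study.

```python
# Minimal sketch: filtering per-frame detections to pre-set feeder/drinker regions.
# The Detection structure, class names, region coordinates and threshold are
# illustrative placeholders, not the values used in the study.

from dataclasses import dataclass

@dataclass
class Detection:
    x: float          # bounding-box centre, x (pixels)
    y: float          # bounding-box centre, y (pixels)
    label: str        # e.g. "eating" or "drinking"
    score: float      # detector confidence

def in_circle(x, y, cx, cy, r):
    """True if the point lies inside a circular drinker area."""
    return (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

def in_rect(x, y, x0, y0, x1, y1):
    """True if the point lies inside a rectangular feeder area."""
    return x0 <= x <= x1 and y0 <= y <= y1

# Hypothetical region definitions (pixel coordinates in the camera view).
DRINKER_CIRCLES = [(850, 400, 60), (1150, 400, 60)]   # (cx, cy, radius)
FEEDER_RECTS = [(100, 600, 700, 760)]                 # (x0, y0, x1, y1)

def filter_detections(detections, min_score=0.5):
    """Keep only confident detections inside a matching pre-set region,
    reducing false alarms from birds merely passing by."""
    kept = []
    for d in detections:
        if d.score < min_score:
            continue
        if d.label == "drinking" and any(in_circle(d.x, d.y, *c) for c in DRINKER_CIRCLES):
            kept.append(d)
        elif d.label == "eating" and any(in_rect(d.x, d.y, *r) for r in FEEDER_RECTS):
            kept.append(d)
    return kept
```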
Table 1 - Details of training and testing dataset for bird behaviour classification
Figure 1 - The areas of the shed observable by the PTZ camera
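As a concrete illustration of the tracking step described above, the sketch below uses a constant-velocity Kalman filter (via the filterpy package, which is an assumption here; the study's tracker follows Wang et al., 2020) to smooth each bird's position across frames and count how long a detected behaviour has been sustained, so that eating or drinking is only logged once the minimum duration is met. The frame rate, duration threshold and noise settings are illustrative.

```python
# Minimal sketch: per-bird constant-velocity Kalman tracking with a dwell-time
# threshold, so "eating"/"drinking" is only logged once the behaviour has been
# maintained for a minimum period. filterpy is an assumption; the tracker used
# in the study follows Wang et al. (2020).

import numpy as np
from filterpy.kalman import KalmanFilter

FPS = 25                  # assumed camera frame rate
MIN_DURATION_S = 5        # assumed minimum duration to count a behaviour

def make_tracker(x, y):
    """Constant-velocity Kalman filter over state [x, y, vx, vy]."""
    kf = KalmanFilter(dim_x=4, dim_z=2)
    kf.x = np.array([x, y, 0.0, 0.0])
    kf.F = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [0, 0, 1, 0],
                     [0, 0, 0, 1]], dtype=float)   # state transition
    kf.H = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0]], dtype=float)   # only position is observed
    kf.P *= 10.0                                   # initial state uncertainty
    kf.R *= 2.0                                    # measurement noise
    return kf

class BirdTrack:
    def __init__(self, x, y):
        self.kf = make_tracker(x, y)
        self.frames_in_behaviour = 0

    def step(self, measurement=None, behaviour_detected=False):
        """Advance one frame; update with an (x, y) centroid if detected.
        Returns True once the behaviour has been sustained long enough."""
        self.kf.predict()
        if measurement is not None:
            self.kf.update(np.asarray(measurement, dtype=float))
        # Count consecutive frames in which the behaviour was detected.
        self.frames_in_behaviour = self.frames_in_behaviour + 1 if behaviour_detected else 0
        return self.frames_in_behaviour >= MIN_DURATION_S * FPS
```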
III. Results
Based on the trained models for detecting and tracking individual birds, we conducted a case study on video footage taken on 14/06/2021. We first plotted the distribution of feeding time between 12:00 PM and 1:00 PM in Figure 2. Within the 1-hour observation, most birds spent 2-4 minutes eating, with fewer than 60 birds taking more than 6 minutes to eat. In Figure 3, we report the number of birds drinking and eating across different periods of an entire day. From the two distributions, we can see that between 4:00 AM and 6:00 PM the number of birds eating is largely constant, while the peak period for drinking occurs between 11:00 AM and 12:00 PM. These natural variations provide useful baselines for monitoring deviations from normal behaviour. The baselines can be further refined with more data and greater variety in farm conditions (e.g. shed, density, breed, age and production system).
Figure 2 - The average feeding time distribution within 1 hour, where the horizontal axis is the feeding time and the vertical axis is the number of birds counted
Figure 3 - The number of birds drinking and eating within the observation area in an entire day, where the horizontal axis represents the time of the day and the vertical axis is the number of birds counted
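To indicate how per-bird behaviour records of this kind can be aggregated into the feeding-time distribution and hourly counts reported in Figures 2 and 3, the sketch below uses an assumed log format of (bird id, behaviour, start, end); this is illustrative rather than the system's actual output schema.

```python
# Minimal sketch: aggregating per-bird behaviour logs into (a) a feeding-time
# distribution and (b) hourly eating/drinking counts. The log format
# (bird_id, behaviour, start, end) is an assumed illustration, not the
# system's actual output schema.

from collections import Counter, defaultdict
from datetime import datetime

# Hypothetical log entries: (bird track id, behaviour, start time, end time).
log = [
    (1, "eating",   datetime(2021, 6, 14, 12, 2),  datetime(2021, 6, 14, 12, 5)),
    (2, "drinking", datetime(2021, 6, 14, 12, 10), datetime(2021, 6, 14, 12, 11)),
]

# (a) Minutes spent eating per bird within the observation window (cf. Figure 2).
minutes_eating = defaultdict(float)
for bird_id, behaviour, start, end in log:
    if behaviour == "eating":
        minutes_eating[bird_id] += (end - start).total_seconds() / 60.0
feeding_histogram = Counter(int(m) for m in minutes_eating.values())  # minutes -> bird count

# (b) Number of birds eating/drinking in each hour of the day (cf. Figure 3).
hourly_birds = defaultdict(set)
for bird_id, behaviour, start, _ in log:
    hourly_birds[(start.hour, behaviour)].add(bird_id)
hourly_counts = {key: len(birds) for key, birds in hourly_birds.items()}
```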
IV. Discussion
In this paper, we have demonstrated a practical application of advanced computer vision and machine learning techniques to poultry production. Our system monitors individual bird behaviours on commercial egg farms using a low-cost and robust setup. In this preliminary study, each camera could observe the individual behaviours of 100 hens, covering an area of around 25 m² in a shed. The system is suited to continuous monitoring of bird and flock welfare and has application in identifying production-related problems that have behavioural signals (such as reduced eating). Together, the welfare and production monitoring and early warning system presents as a practical, low-cost, labour-free tool to support commercial egg production in Australia. We plan to improve the system by defining and identifying more complex individual bird and flock activities, which may require consideration of temporal and spatial dependencies (i.e. what activities are happening where and when, and which environmental, management or climatic changes are related to these changes).
ACKNOWLEDGEMENT: The authors are grateful to Australian Eggs for their financial support of this study.
Presented at the 33rd Annual Australian Poultry Science Symposium 2022.