Cengiz Cetinkaya
Former HSLU Data Science Master Student
Idea and project
Cengiz's Master's thesis takes us to a trough full of ideas for animal welfare
A project is underway to launch a "Happy Cow" quality label that cattle breeders can use to improve the public’s perception of their operations.
And anyone wanting to learn more about their four-legged best friend’s mental state can use the "waggydog" app from the start-up FaunaAI. The first version of the app, for dogs, is available for download, and prototypes for cows, horses and cats are already in the pipeline.
Knowing more about cows leads to better decisions
We’ve always valued cows for their meat and milk. But recent studies show that they also exhibit a broad range of emotions and social behaviours, including separation anxiety. Despite this proven emotional complexity, the industry still often inflicts undue pain on animals because we don’t fully understand their emotional state. While some positive steps are underway, such as the Obsalim method for diagnostics and feed adjustment, neither science nor the industry has systematically focused on the wellbeing of cows.
Cengiz and his team are convinced that a better understanding of cows’ emotions will allow us to develop more appropriate and sustainable practices in agriculture. It can also make us more mindful of how we consume meat and more aware of the environmental impact of livestock, areas where machine learning can make a difference.
Procedure and models
Linking GoPro images with actual emotions in real-time
Cengiz started by collecting as many high-quality images and videos of cows as possible. The animals were more likely to accept a GoPro camera in barns and meadows than panic-inducing drones flying over their heads. He also obtained images from two Demeter farms, Gut Rheinau and Kloster Schönthal in Langenbruck, and from internet collections that included videos of cows on the way to the slaughterhouse.
Cows and their “happy” and “unhappy” emotional states.
In a workshop with experts and based on a concept of the Research Institute of Organic Agriculture (FiBL), Cengiz developed a framework for identifying the mood in cows and then labelled the collected images and videos as either “happy” or “unhappy.”
Framework for training the image data
He then trained two models on the labelled data about the cows' mental state: one derives it from the face and head position, the other from body posture. The next step was to turn the whole thing into an app prototype that combines facial recognition and body-posture analysis and can therefore recognise the emotions of cows in real time (one possible way of fusing the two predictions is sketched below).
Applied emotion recognition in an app (prototype).
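The thesis summary does not spell out how the two models’ outputs are merged in the prototype, so the fusion rule below is purely illustrative: a weighted average of the two "happy" confidence scores.

```python
# Purely illustrative fusion of the two models' outputs; the actual
# rule used in the prototype is not described in the source.
def fuse_predictions(p_face_happy: float, p_posture_happy: float,
                     w_face: float = 0.5) -> str:
    """Combine the face and posture models' 'happy' probabilities."""
    p_happy = w_face * p_face_happy + (1 - w_face) * p_posture_happy
    return "happy" if p_happy >= 0.5 else "unhappy"

print(fuse_predictions(0.8, 0.6))  # -> happy
```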
From input to output: a look at the models
Cengiz wanted to train AI models so that they can predict cow happiness as reliably as possible when applied to any image of the animal. To increase accuracy, Cengiz combined two approaches: gauge the animal’s emotions by using facial recognition and by reading its body posture. He divided his approach into three steps: labelling the images, training the model, and evaluating the output.
Based on the trained data, the model recognises cows in pictures and classifies them as “happy” or “unhappy.”
Capturing emotions from the head position
You only live once! And you definitely only look once, especially when you use YOLOv5, an object recognition model that works in real time and is built as a convolutional neural network, i.e. a deep learning architecture with many layers. Cengiz fed it his data using PyTorch, the framework on which YOLOv5 is built. The model for the cows was trained from scratch with the YOLOv5 algorithm, without any transfer learning from pre-trained object recognition. This made it possible to recognise cows’ emotions independently, based on their head position.
Step 1: Prepare the data
The first step involved preparing the pictures for training by drawing frames (bounding boxes) around the cow heads and labelling the faces as “happy” or “unhappy.” The frames enabled the YOLOv5 model to recognise the position of the head. Each frame used for training was then linked to a text file describing the position of the head in relation to the overall image (see the sketch below).
Frames and linked files indicate the position of the head in relation to the image.
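For readers unfamiliar with YOLOv5’s annotation format: each image gets a plain-text label file, one line per frame. A minimal sketch, with illustrative class IDs (0 = "happy", 1 = "unhappy") and coordinates:

```python
from pathlib import Path

# Minimal sketch of a YOLO-format annotation, assuming the standard
# YOLOv5 layout (images/ and labels/ folders, one .txt per image).
Path("labels").mkdir(exist_ok=True)

# Each line: <class_id> <x_center> <y_center> <width> <height>,
# with all values normalised to the image dimensions (0..1).
with open("labels/cow_001.txt", "w") as f:
    f.write("0 0.512 0.334 0.210 0.185\n")  # one "happy" cow head
```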
Step 2: Train the model
The basic YOLOv5 model serves as the starting point of the training process: a simple but effective model that is normally pre-trained on large image collections, although here, as noted above, it was trained from scratch. Training can take between 5 and 20 hours, depending on variables such as the number of epochs, the batch size, and the capacity of the system’s graphics processing unit. It’s important to save checkpoints regularly, e.g. every ten epochs, so that earlier models are retained and can be restored if necessary.
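A training run of this kind is typically launched from the cloned ultralytics/yolov5 repository. The sketch below shows what such an invocation might look like; the dataset file name and parameter values are assumptions, not the thesis’s actual configuration:

```python
import subprocess

# Hypothetical YOLOv5 training run; assumes the ultralytics/yolov5
# repository is cloned and its requirements are installed.
subprocess.run([
    "python", "train.py",
    "--img", "640",                 # input image size
    "--batch", "16",                # batch size
    "--epochs", "300",              # number of training epochs
    "--data", "cow_emotions.yaml",  # dataset config: paths + class names (illustrative)
    "--cfg", "yolov5s.yaml",        # model architecture definition
    "--weights", "",                # empty string = train from scratch, no pre-trained weights
    "--save-period", "10",          # save a checkpoint every 10 epochs, as described above
], check=True)
```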
Step 3: Evaluate the output
Machine learning can determine the emotions of cows from their head posture with 70% to 80% accuracy. Training used 236 samples plus a test set of 59 samples, a comparatively small number of images that limits accuracy. But as soon as larger data sets and more training iterations are available, the model could also be used in agricultural operations, e.g. to record cows’ emotions statistically over time and award the "Happy Cow" label once a certain threshold is reached.
The confusion matrix (CM) is the basis for evaluating the model: a two-dimensional array that counts correct ("true") and incorrect ("false") classifications.
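As a quick illustration of how such a matrix and the accuracy figure relate, here is a minimal sketch with placeholder predictions, not the thesis’s actual test results:

```python
import numpy as np

# Placeholder ground truth and predictions: 0 = "happy", 1 = "unhappy".
y_true = np.array([0, 0, 1, 1, 0, 1])
y_pred = np.array([0, 1, 1, 1, 0, 0])

# Rows = actual class, columns = predicted class.
cm = np.zeros((2, 2), dtype=int)
for t, p in zip(y_true, y_pred):
    cm[t, p] += 1

# Accuracy = correct predictions (matrix diagonal) / all predictions.
accuracy = np.trace(cm) / cm.sum()
print(cm)
print(f"accuracy = {accuracy:.2f}")
```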
Linking emotions with body position
Cengiz did not rely on facial recognition alone but added a second approach that makes predictions based on a cow’s posture. Here, he used two deep learning models: one recognises the presence of a cow and its posture in an image, the other identifies the cow’s emotion from that posture.
Step 1: Prepare the data
Cengiz first marked 16 points on the cow’s body, e.g. on the jaw and hooves. He then analysed the postures in the cow pictures and assigned 200 labels for “happy” cows and 200 for “unhappy” ones, obtaining 400 pictures for training purposes.
Fixing 16 points on the cow's body as basis for determining the posture.
Step 2: Train the model
Cengiz fed the labelled cow-posture data into the model using DeepLabCut, a method for estimating poses in 2D and 3D. DeepLabCut is based on transfer learning and is known for its accuracy in estimating animal postures.
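In code, the standard DeepLabCut workflow looks roughly as follows; the project name and video paths are illustrative, not taken from the thesis:

```python
import deeplabcut

# Illustrative project setup; names and paths are assumptions.
config = deeplabcut.create_new_project(
    "CowPosture", "cengiz", ["videos/barn_cam.mp4"]
)

deeplabcut.extract_frames(config)           # select frames to annotate
deeplabcut.label_frames(config)             # GUI: mark the 16 body points
deeplabcut.create_training_dataset(config)  # package the annotations
deeplabcut.train_network(config)            # fine-tune pre-trained weights (transfer learning)
deeplabcut.analyze_videos(config, ["videos/new_footage.mp4"])  # predict keypoints
```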
Once the cow postures had been trained, Cengiz transformed the data with the position points and the “happy” and “unhappy” labels into a NumPy array, a multi-dimensional matrix in Python that holds, for each image, the location of the keypoints and their interpretation. The output image then shows the posture together with the corresponding emotion.
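A minimal sketch of this step, assuming DeepLabCut has already produced (x, y) coordinates for the 16 body points of each of the 400 images; the random data and the logistic-regression classifier are placeholders, not the thesis’s actual model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

n_images, n_points = 400, 16
keypoints = np.random.rand(n_images, n_points, 2)  # 3D array: image x point x (x, y); placeholder values
labels = np.array([0] * 200 + [1] * 200)           # 0 = "happy", 1 = "unhappy"

# Flatten each pose into one feature vector per image.
X = keypoints.reshape(n_images, n_points * 2)      # shape (400, 32)

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```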
Step 3: Evaluate the output
The emotion-recognition model based on the cow’s posture achieved 97% accuracy.
Application and results
From happy cow to happy dog
Anyone walking past a meadow will soon be able to take pictures of cows and learn more about their current state. That’s because Cengiz has turned his models into a prototype for an app that can recognise cow emotions and breeds, using a supervised machine learning model trained on labelled images. In other words, farmers and others will be able to query and find information about cows by using the app.
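A trained YOLOv5 model of this kind can be queried from Python via torch.hub, which is plausibly how such an app prototype would run inference; the weights path and image name below are illustrative:

```python
import torch

# Load custom-trained YOLOv5 weights (path is illustrative).
model = torch.hub.load("ultralytics/yolov5", "custom", path="best.pt")

results = model("meadow_snapshot.jpg")  # run detection on one photo
results.print()                         # summary, e.g. "1 happy"

# Each detection row: x1, y1, x2, y2, confidence, class index.
for *box, conf, cls in results.xyxy[0].tolist():
    print(model.names[int(cls)], f"confidence {conf:.2f}")
```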
This approach promises a wealth of potential, and a team of former students and renowned researchers from HSLU, the Technical University of Munich (TUM) and the Massachusetts Institute of Technology (MIT; Peter Gloor) is now working with the start-up FaunaAI to develop commercial applications. The idea is to learn more about the emotions not only of cows but also of dogs, cats and horses.
Here’s a look at the app for dogs:
The "waggydog" app is already available for download:
Behind the scenes
Challenges and potential
The accuracy of a machine learning model depends largely on the quantity and quality of the images per category, and obtaining and preparing them proved to be more challenging than finding the right methods and creating a prototype. After all, although data is very valuable, it must first be collected and prepared.
For example, Cengiz had to find a way to collect images of cows expressing negative emotions naturally rather than in response to a scare or surprise. Images from slaughterhouses were generally not available, so he had to rely on internet searches for images showing fear and panic, which were often of poor quality and thus affected the training of the model.
To continuously improve the quality of the model, many more images will have to be incorporated into the data set, e.g. by using automated crawlers and by contacting farmers, who have access to the most authentic images of cows in their everyday lives. He also mentions another fundamental aspect of his work: how reliably we can classify animal emotions in the first place when producing training data and labelling emotions.
We don’t have any standardised procedures yet for classifying such emotions and thus have to rely mostly on the experience of farmers and experts. One possibility is to study this issue by working closely with dairy farmers who are especially empathetic towards their livestock. More opinions and classifications from experts could increase the reliability of the emotional labels and thus improve the training data and the output of the AI models.
We would like to thank Cengiz Cetinkaya and Peter Gloor for sharing these fascinating insights into this groundbreaking project!
Author & Master's student: Cengiz Cetinkaya
Supervising lecturer: Peter Gloor, Massachusetts Institute of Technology (MIT)
Innovative researcher, scientist, lecturer & entrepreneur