Classification of Multiple Emotional States from Facial Expressions in Mice using a Deep Learning-Based Image Analysis

PLOS ONE (2023)

Abstract
Facial expressions are widely recognized as universal indicators of underlying internal states in most animal species, and thus offer a non-invasive measure for predicting physical and mental conditions. Despite the advancement of artificial intelligence-assisted tools for the automated analysis of voluminous facial expression data in human subjects, corresponding tools for mice remain limited. Because mice are the most prevalent model animals for studying human health and disease, a comprehensive characterization of emotion-dependent patterns of facial expressions in mice could extend our knowledge of the basis of emotions and related disorders. Here, we present a framework for developing a deep learning-powered tool for classifying mouse facial expressions. We demonstrate that our machine vision system accurately classified three different emotional states from mouse facial images. Moreover, we objectively determined how our classifier characterized the differences among the facial images using an interpretation technique called Gradient-weighted Class Activation Mapping (Grad-CAM). Our approach is likely to facilitate the non-invasive decoding of a variety of emotions from facial images in mice.

Competing Interest Statement
The authors have declared no competing interest.
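The abstract describes two components: a convolutional classifier that assigns one of three emotional states to a mouse facial image, and Grad-CAM to show which image regions drive each prediction. The sketch below is not the authors' code; the backbone (ResNet-18), the label names, and the hook-based Grad-CAM implementation are illustrative assumptions about how such a pipeline could be put together in PyTorch.

```python
# Minimal sketch (assumed, not from the paper): fine-tune a CNN for three
# emotional states and visualize its evidence with Grad-CAM.
import torch
import torch.nn.functional as F
from torchvision import models

CLASSES = ["neutral", "positive", "negative"]  # hypothetical label set

# ImageNet-pretrained backbone with the final layer replaced for 3 classes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

# Capture activations and gradients of the last convolutional block.
activations, gradients = {}, {}

def fwd_hook(module, inp, out):
    activations["value"] = out.detach()

def bwd_hook(module, grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

target_layer = model.layer4[-1]
target_layer.register_forward_hook(fwd_hook)
target_layer.register_full_backward_hook(bwd_hook)

def grad_cam(image):
    """image: tensor of shape (1, 3, 224, 224), already normalized."""
    logits = model(image)
    pred = logits.argmax(dim=1)
    model.zero_grad()
    logits[0, pred].backward()                     # gradient of the predicted class score
    weights = gradients["value"].mean(dim=(2, 3))  # global-average-pool the gradients
    cam = (weights[:, :, None, None] * activations["value"]).sum(dim=1)
    cam = F.relu(cam)                              # keep only positive evidence
    cam = F.interpolate(cam[None], size=image.shape[2:],
                        mode="bilinear", align_corners=False)[0]
    cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)
    return CLASSES[pred.item()], cam               # label and heatmap in [0, 1]

# Usage with a random stand-in image (a real pipeline would crop and
# normalize the mouse face region first):
label, heatmap = grad_cam(torch.randn(1, 3, 224, 224))
print(label, heatmap.shape)
```

The Grad-CAM heatmap highlights facial regions (for example around the ears, eyes, or snout) that the classifier weighs most heavily, which is the kind of objective characterization of facial differences the abstract refers to.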