AI and deep learning: Responsible use of facial expression analysis

The European Parliament adopted its negotiating position on the first-ever regulation of AI. In this blog post, we explore how to use facial expression analysis responsibly and how it can contribute to scientific research.

Posted by Jason Rogers
Published on Wed 12 Jul 2023

On June 14, 2023, the European Parliament adopted its negotiating position on the AI Act, the first-ever regulation of artificial intelligence (AI). The goal of the act is to ensure that AI developed and used in Europe adheres to EU rights and values, namely those related to human safety, privacy, transparency, non-discrimination, and social and environmental well-being. The AI Act is expected to go into effect in 2024.

The act establishes differing rules based on four levels of perceived risk: unacceptable, high, limited, and minimal. While the act prohibits certain AI practices, such as social scoring and intrusive biometric identification, it is crucial to note that the European Parliament explicitly exempted scientific research from these restrictions, stating on page 64 of its response to the European Commission:

“It is therefore necessary to exclude from its scope AI systems specifically developed for the sole purpose of scientific research and development and to ensure that the Regulation does not otherwise affect scientific research and development activity on AI systems.”

In this blog post, we will explore how to use facial expression analysis responsibly and how it can contribute to various fields of scientific research.

Ethical considerations of facial expression analysis

Before exploring the contributions of facial expression analysis, it is important to note that Noldus Information Technology agrees with the AI Act's position on unacceptable and high-risk systems, such as emotion recognition in law enforcement or border management.

Noldus does not allow the use of its facial expression analysis tool, FaceReader, in a number of applications, including:

  • Active defense, meaning application in weapon systems, for example to support a pilot in a fighter plane or an operator controlling a weaponized drone.
  • Biometric data collection in a judicial context such as a police interrogation. FaceReader cannot and shall not be used as a ‘lie detector’.
  • Surveillance of people in public spaces for security purposes.  
  • Any other use that may potentially violate fundamental human rights.

FaceReader is not capable of recognizing or identifying faces or people, and is therefore unsuitable for surveillance purposes. Furthermore, Noldus disapproves of the use of the technology without explicit prior consent from the individuals whose faces are being captured. How this consent is obtained may differ depending on the application in which FaceReader is being used, but the responsibility for acquiring consent rests with the organization applying the software.

Find out more about our ethical policy regarding facial expression analysis on our website.

FREE WHITE PAPER: FaceReader methodology

Download the free FaceReader methodology note to learn more about facial expression analysis theory.

  • How FaceReader works
  • More about calibration
  • Insight into the quality of analysis & output

How does facial expression analysis work?

Facial expression analysis involves the study and interpretation of facial movements and expressions to create meaningful insights. Noldus’ FaceReader 9 uses several trained models to derive facial expressions. First, a face-finding algorithm locates the position of the face within a static image or video frame. From there, a neural network-based face analysis model processes the image(s) through a series of layers.
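To make the two-stage idea concrete, below is a minimal sketch of such a pipeline in Python, assuming OpenCV for face finding and PyTorch for the classification network. The network architecture, class labels, helper names, and input file are illustrative placeholders and do not represent FaceReader's internal implementation.

# Minimal sketch of a two-stage facial expression pipeline:
# 1) locate the face, 2) classify the cropped face with a small CNN.
# The network and labels are illustrative stand-ins, not FaceReader internals.
import cv2
import torch
import torch.nn as nn

EXPRESSIONS = ["happy", "sad", "angry", "surprised", "scared", "disgusted", "neutral"]

class TinyExpressionNet(nn.Module):
    """Maps a 48x48 grayscale face crop to scores for the seven basic expressions."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
        )
        self.classifier = nn.Linear(32 * 12 * 12, len(EXPRESSIONS))

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def analyze_frame(frame_bgr, model, detector):
    """Detect the largest face in a frame and return expression probabilities."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
    tensor = torch.from_numpy(crop).float().div(255.0).view(1, 1, 48, 48)
    with torch.no_grad():
        probs = torch.softmax(model(tensor), dim=1).squeeze(0)
    return dict(zip(EXPRESSIONS, probs.tolist()))

if __name__ == "__main__":
    detector = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    model = TinyExpressionNet().eval()          # untrained weights; for illustration only
    frame = cv2.imread("face.jpg")              # hypothetical input image
    print(analyze_frame(frame, model, detector))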

These layers were trained using deep learning, a type of machine learning in which an artificial neural network is trained on a large amount of data and used to make predictions on new data. The training process involves iteratively adjusting the model's parameters based on feedback to improve prediction accuracy until no further improvements are observed.
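In code, that iterative adjustment typically takes the form of a standard supervised training loop: a loss function compares the network's predictions with the annotated labels, an optimizer updates the parameters to reduce that loss, and training stops once the validation loss no longer improves. The sketch below shows this general pattern with placeholder data, hyperparameters, and a trivial stand-in model; it is not FaceReader's actual training procedure.

# Generic supervised training loop; dataset, hyperparameters, and stopping rule are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

def train(model, train_loader, val_loader, max_epochs=50, patience=5):
    criterion = nn.CrossEntropyLoss()                        # compares predictions with expert labels
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    best_val, stale = float("inf"), 0

    for epoch in range(max_epochs):
        model.train()
        for faces, labels in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(faces), labels)
            loss.backward()                                  # feedback: gradient of the prediction error
            optimizer.step()                                 # adjust parameters to reduce the error

        model.eval()
        with torch.no_grad():
            val_loss = sum(criterion(model(f), y).item() for f, y in val_loader) / len(val_loader)

        # Stop once the validation loss shows no further improvement.
        if val_loss < best_val:
            best_val, stale = val_loss, 0
        else:
            stale += 1
            if stale >= patience:
                break
    return model

if __name__ == "__main__":
    # Random tensors stand in for annotated 48x48 grayscale face crops and their 7-class labels.
    X, y = torch.rand(64, 1, 48, 48), torch.randint(0, 7, (64,))
    loader = DataLoader(TensorDataset(X, y), batch_size=16)
    model = nn.Sequential(nn.Flatten(), nn.Linear(48 * 48, 7))   # trivial stand-in classifier
    train(model, loader, loader)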

Discover more FaceReader facts in this blog post.

Deep learning in FaceReader

In the case of FaceReader 9, the training and testing of the network1 is done on a large collection of diverse facial images manually annotated by human FACS (Facial Action Coding System) experts. The result is an output of the seven “basic” expressions or emotions: happy, sad, angry, surprised, scared, disgusted, and neutral. Additionally, the optional Action Unit Module evaluates the engagement of twenty individual facial muscles, or action units.
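Conceptually, the result for each analyzed frame can be pictured as a record holding an intensity per basic expression plus, optionally, per action unit. The structure below is a hypothetical illustration; the field names and values are not FaceReader's export format.

# Hypothetical per-frame result structure; fields and values are illustrative only.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class FrameResult:
    timestamp_s: float                                             # position in the video, in seconds
    expressions: Dict[str, float]                                  # intensity per basic expression, 0.0-1.0
    action_units: Dict[str, float] = field(default_factory=dict)   # e.g. "AU12" -> activation intensity

example = FrameResult(
    timestamp_s=12.4,
    expressions={"happy": 0.78, "sad": 0.02, "angry": 0.01, "surprised": 0.05,
                 "scared": 0.01, "disgusted": 0.01, "neutral": 0.12},
    action_units={"AU06": 0.61, "AU12": 0.83},   # cheek raiser and lip corner puller, typical of a smile
)
print(max(example.expressions, key=example.expressions.get))       # -> happy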

Over the past decade, FaceReader has been used in many peer-reviewed scientific publications, on topics ranging from children’s reactions to moral transgressions2 to responses to odors3 to predicting advertising effectiveness4. Since its initial launch, there have been over 1,700 publications using FaceReader. Furthermore, FaceReader has been shown to be accurate and on par with human coders5.

Supporting scientific innovation

It should be noted that the European Parliament's President, Roberta Metsola, began the press conference announcing the adoption of the Parliament's negotiating position by reading a statement prepared by ChatGPT. This, of course, was done to illustrate the point that everyone must be cautious with generative and other AI technologies.

The European Parliament's recognition of the importance of exemptions for research activities is commendable. These exemptions facilitate scientific innovation and enable researchers to leverage facial expression analysis tools to explore new frontiers of knowledge. At the same time, they provide safe environments for testing and refining AI models before their deployment, striking a balance between innovation and citizen protection.

Contributions of facial expression analysis

Facial expression analysis, when implemented responsibly and ethically, has the potential to significantly contribute to scientific research, innovation, and the overall betterment of society. By understanding and interpreting facial expressions, researchers can deepen their understanding of human behavior and enhance human-computer interaction.

Want to learn more about facial expression analysis? Check out one of our on-demand webinars on FaceReader!

The European Parliament's adoption of its position on the AI Act demonstrates a commitment to striking the right balance between harnessing the positive potential of AI technologies and ensuring the protection of human rights, privacy, and non-discrimination.

As negotiations continue, it is crucial to foster dialogue and collaboration among policymakers, researchers, and technology developers to ensure that facial expression analysis remains a valuable tool for scientific advancement, while upholding the highest ethical standards. By doing so, we can maximize the benefits of facial expression analysis while safeguarding individual rights and societal well-being.

References

  1. Gudi, A.; Tasli, M.; Den Uyl, T.; Maroulis, A. (2015). Deep Learning based FACS Action Unit Occurrence and Intensity Estimation. International Conference on Automatic Face and Gesture Recognition (FG2015), https://doi.org/10.1109/FG.2015.7284873.
  2. Dys, S.; Malti, T. (2016). It's a two-way street: Automatic and controlled processes in children's emotional responses to moral transgressions. J Exp Child Psychol, 152, 31-40, https://doi.org/10.1016/j.jecp.2016.06.011.
  3. He, W.; Boesveldt, S.; De Graaf, C.; De Wijk, R. (2014). Dynamics of autonomic nervous system responses and facial expressions to odors. Front. Psychol., 5:110, https://doi.org/10.3389/fpsyg.2014.00110.
  4. Lewinski, P.; Fransen, M.; Tan, E. (2014). Predicting advertising effectiveness by facial expressions in response to amusing persuasive stimuli. Journal of Neuroscience, Psychology, and Economics, 7(1), 1-14. https://doi.org/10.1037/npe0000012.
  5. Küntzler, T.; Höfling, T.; Alpers G. (2021). Automatic Facial Expression Recognition in Standardized and Non-standardized Emotional Expressions. Front. Psychol. 12:627561. https://doi.org/10.3389/fpsyg.2021.627561.
