The Story of EmploTrack
When workplace monitoring hits an absolute prohibition, and how to change course
A fictional scenario based on realistic situations
The Trigger
How it started
EmploTrack had developed an innovative product: SentimentPulse analyzed facial expressions and voice tones to measure stress levels. HR departments were enthusiastic: finally, objective data about employee wellbeing instead of subjective surveys.
Until the AI Act came into force. Article 5(1)(f) explicitly prohibits AI systems that infer emotions from biometric data in the workplace; the only carve-out, for medical or safety reasons, did not apply. EmploTrack's core product was suddenly illegal.
"We thought we were helping employees by preventing burnout. The regulator saw it differently: we were violating their fundamental rights."
The Questions
What did they need to find out?
Why is emotion recognition in the workplace prohibited?
The team dove into the background of the prohibition. The legislator recognized something fundamental: in the workplace there is always a power imbalance. An employee cannot freely say "no" to monitoring by the employer.
💡 The insight
Consent is illusory in an employment relationship. If your boss asks whether you want to participate in emotion recognition, you feel pressure to say yes, regardless of your actual comfort. That's why the legislator chose an absolute prohibition instead of consent as a legal basis.
📋 Why this matters
The DPA confirmed this in its report: the unequal power relationship makes free and informed consent from employees illusory. This principle also applies to educational institutions, where the same power imbalance exists between student and institution.
What makes biometric data so sensitive?
EmploTrack had treated facial expressions as "ordinary" data. The legal analysis showed this was a mistake: biometric data used for identification or categorization fall under the GDPR's special categories of personal data.
💡 The insight
Biometric data are uniquely identifying and immutable: you cannot reset your face like a password. The inferred emotions are, moreover, "highly intimate and privacy-sensitive data." This triggers the strictest data-protection regime.
📋 Why this matters
The AI Act automatically classifies systems that use biometric data for categorization as high-risk. In the workplace it goes further: emotion recognition is simply prohibited, no matter how well your compliance processes are set up.
Can the technology even reliably detect emotions?
Besides the legal objections, the team examined the scientific basis. The findings were sobering: the Data Protection Authority called the technology "dubious and risky," built on "contested assumptions."
💡 The insight
The technology doesn't measure emotions; it measures physical signals and links them to assumed emotions. A high heart rate can signal stress, but also excitement or coffee. A loud voice can signal anger, but also enthusiasm. This proxy mapping is fundamentally unreliable, as the sketch after this question illustrates.
📋 Why this matters
Research shows that emotion recognition systems assign more negative emotions to people with darker skin, a direct discrimination risk. The idea that emotions are universal and expressed the same way by everyone is scientifically contested; cultural, individual, and contextual differences are ignored.
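To make the proxy problem concrete, here is a minimal sketch with hypothetical signals and labels (none of this comes from SentimentPulse itself): the same physiological observation is consistent with several underlying states, so a system that returns a single emotion label is choosing, not measuring.

```python
# Hypothetical mapping from observable signals to plausible causes.
# The point: the relation is one-to-many, so inverting it is guesswork.
PLAUSIBLE_CAUSES = {
    "elevated_heart_rate": ["stress", "excitement", "caffeine", "exercise"],
    "raised_voice": ["anger", "enthusiasm", "noisy environment"],
    "furrowed_brow": ["frustration", "concentration", "bright light"],
}

def infer_emotion(signal: str) -> str:
    """What an emotion recognition system effectively does: collapse an
    ambiguous signal to one label and discard the alternatives."""
    return PLAUSIBLE_CAUSES[signal][0]

for signal, causes in PLAUSIBLE_CAUSES.items():
    print(f"{signal}: labeled '{infer_emotion(signal)}', "
          f"but equally consistent with {', '.join(causes[1:])}")
```

Whatever confidence score such a system reports, it cannot recover information the signal never contained.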
Is there a legal path to employee wellbeing monitoring?
The team had to redesign their product. Was there a way to monitor employee wellbeing without violating the prohibition?
💡 The insight
Yes, but it looks fundamentally different: anonymous, aggregated surveys in which employees participate voluntarily. No biometrics, no individual tracking. The focus moves to team-level insights instead of per-person stress scores; the value shifts from surveillance to listening. A sketch of this aggregation approach follows after this question.
📋 Why this matters
The DPA points to the "fireworks analogy": the AI Act ensures products are safe, but whether society wants them at all is a separate question. EmploTrack had to ask itself: even if this were technically possible, do we want to monitor employees this way?
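As a minimal sketch of the redesigned approach (hypothetical data and a threshold chosen for illustration, not a production implementation): voluntary survey scores are aggregated per team, and teams with too few respondents are suppressed so no individual answer can be singled out.

```python
from statistics import mean

MIN_GROUP_SIZE = 5  # illustrative suppression threshold

def team_wellbeing_report(responses):
    """Return a mean wellbeing score per team, or None when a team has
    too few voluntary responses to report safely."""
    return {
        team: round(mean(scores), 1) if len(scores) >= MIN_GROUP_SIZE else None
        for team, scores in responses.items()
    }

# Hypothetical voluntary responses on a 1-10 wellbeing scale.
responses = {
    "engineering": [7, 6, 8, 5, 7, 6],
    "finance": [4, 8],  # too few respondents: suppressed, not reported
}
print(team_wellbeing_report(responses))
# {'engineering': 6.5, 'finance': None}
```

The design choice is the inverse of SentimentPulse: no data point can be traced back to a person, and the product's value lives entirely at the group level.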
The Journey
Step by step to compliance
Product-market success
SentimentPulse was adopted by multiple enterprises. HR departments were enthusiastic about "objective wellbeing data."
The AI Act check
During AI Act compliance preparation, it turned out that emotion recognition in the workplace is explicitly prohibited under Article 5(1)(f); the narrow exception for medical or safety reasons did not apply.
Scientific reflection
The team investigated the scientific basis. The findings: "dubious and risky," built on "contested assumptions" with discrimination risks.
Product transformation
From biometric surveillance to anonymous, voluntary wellbeing surveys. Focus on team-level insights, not individual scores.
Customer communication
Existing customers were informed about the legal change and the migration path to the new product. Some left; others valued the ethical course correction.
New positioning
EmploTrack positioned itself as a "privacy-first wellbeing platform," a differentiator in a market full of surveillance tools.
The Obstacles
What went wrong?
❌ Challenge
Core product turned out to be prohibited under AI Act Article 5(1)(f)
✅ Solution
Complete product transformation to anonymous, voluntary surveys
❌ Challenge
Scientific basis of emotion recognition was "dubious and risky"
✅ Solution
Accept that the technology is fundamentally unreliable and look for alternatives
❌ Challenge
Losing existing customers who wanted surveillance features
✅ Solution
Repositioning as privacy-first platform, attracting new customers who value ethics
We thought we were helping employees by measuring them. We learned that listening, with respect for privacy, is more effective than surveillance.
The Lessons
What can we learn from this?
Some applications are simply prohibited
Not every problem is a compliance exercise. Some AI applications are illegal, period. Check Article 5 first.
Consent doesn't work everywhere
In power relationships (employer-employee, school-student), free consent is illusory.
Ask the scientific question
Can the technology even do what we claim? For emotion recognition: probably not.
Compliance can be a differentiator
Privacy-first positioning can be a competitive advantage in a market full of surveillance.
Does your organization use biometric analysis?
Check whether your application falls under an AI Act prohibition before investing further; the screening sketch below illustrates this first check.
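As a rough illustration of that first check (a sketch of the decision logic described above, not legal advice; the function and parameter names are invented for this example):

```python
def screen_emotion_ai(infers_emotions: bool,
                      uses_biometric_data: bool,
                      context: str,
                      medical_or_safety_purpose: bool) -> str:
    """Rough Article 5(1)(f) screening for emotion recognition systems.
    context: e.g. 'workplace', 'education', 'consumer'."""
    if infers_emotions and uses_biometric_data:
        if context in {"workplace", "education"} and not medical_or_safety_purpose:
            return "Prohibited under Article 5(1)(f): stop and redesign"
        return "Likely high-risk: full AI Act obligations apply"
    return "Not covered by this check: assess the other risk categories"

# SentimentPulse as described in this story:
print(screen_emotion_ai(True, True, "workplace", False))
# Prohibited under Article 5(1)(f): stop and redesign
```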