There is a lot of confusion around AI technologies.
We hope that this short summary (which is by no means meant to be comprehensive) sheds some light on what we do and what we do not do.
It highlights what is allowed and what is not when it comes to data and privacy. The goal is to open up a conversation and invite those who have more questions to speak with us.
Facial Analysis vs Face Recognition
Data from our ethical facial analysis service cannot be used to identify individuals.
The advanced IoT sensors by Zenus do not store or transmit video or images, locally or in the cloud. The sensors and AI models analyze the audience without using unique identifiers. It is a cutting-edge piece of technology with built-in privacy protections.
If someone wants to use other technologies to identify individuals and combine the data, they need to obtain explicit consent first.
This is true for hotels, convention centers, event organizers, technology companies, and others. Otherwise, they will likely expose themselves to liability.
GDPR and Data Privacy Regulations
Different regions and implementations have different requirements.
The European Data Protection Board, in particular, has clearly noted that facial analysis alone does not fall under Article 9 of the GDPR.
See section 80 in the Guidelines adopted on January 29, 2020 [link].
“However, when the purpose of the processing is for example to distinguish one category of people from another but not to uniquely identify anyone the processing does not fall under Article 9.”
See section 14 in the Guidelines adopted on April 26, 2023 [link].
“The mere detection of faces by so-called “smart” cameras does not necessarily constitute a facial recognition system either. […] they may not be considered as biometric systems processing special categories of personal data, provided that they do not aim at uniquely identifying a person […] .”
In simple words: are you using the service alone? Great.
Are you combining it with identifying information? You need to obtain consent first.
Our AI badge scanning reads attendee IDs
To avoid confusion, our ethical facial analysis technology consists of passive and anonymous data collection.
The independent and separate AI badge scanning technology is an opt-in-based service. These should not be conflated with one another.
Now that this is clear, it is worth noting that we drive the adoption of badges with only the attendee ID encoded in the QR code. This maximizes privacy and ensures that our system does not receive personal information.
Moreover, nobody is scanned without knowingly standing before the kiosks and presenting their badge.
The data owner may cross-reference attendee IDs to registration records for individuals who opted in. The process for obtaining consent is the same as with all badge-scanning services and workflows.
Likewise, the responsibility for obtaining consent and disclosing who receives the data and how it is used belongs to the data owner. These are established practices and regulations.
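The workflow above can be sketched in a few lines of code. This is a hypothetical illustration, not our actual implementation: the field names ("attendee_id", "opted_in") and sample records are assumptions made for the example. The point it demonstrates is that a badge scan carries only a bare attendee ID, and personal details are attached only when the registration record shows an explicit opt-in.

```python
# Hypothetical sketch of a data owner's cross-referencing step.
# A scan record contains only a non-identifying attendee ID; personal
# information is joined in only for attendees who opted in.

scans = [
    {"attendee_id": "A-1001", "kiosk": "K1"},
    {"attendee_id": "A-1002", "kiosk": "K1"},
]

# Registration records held by the data owner (illustrative data).
registration = {
    "A-1001": {"name": "Jane Doe", "opted_in": True},
    "A-1002": {"name": "John Roe", "opted_in": False},  # no opt-in
}

def enrich_opted_in(scans, registration):
    """Attach registration details only where the attendee opted in;
    everyone else remains a bare, non-identifying attendee ID."""
    enriched = []
    for scan in scans:
        record = registration.get(scan["attendee_id"])
        if record and record["opted_in"]:
            enriched.append({**scan, "name": record["name"]})
        else:
            enriched.append(dict(scan))  # no personal info attached
    return enriched

result = enrich_opted_in(scans, registration)
```

Under these assumptions, the first scan is enriched with a name while the second stays anonymous, because the opt-in flag — not the scan itself — controls whether identification happens.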
Legal vs Moral Considerations. Consent vs Notice
People often conflate face recognition (identification) with facial analysis (anonymized data). In a similar way, they conflate legal and moral considerations.
It might not be legally required to provide notice about the use of facial analysis in many settings. But we still think it is morally a good idea to do so in the spirit of transparency and education.
Therefore, we ask our clients to post signage on-site, talk about the use of our service in their marketing communications, and include it in their online terms and conditions.
Most facial analysis settings do not require consent, and consent is practically impossible to enforce without identifying individuals. Notice is also not legally required in most settings but is recommended.
What about consent versus notice? We love advance notice. Consent, on the other hand, might defeat the purpose of anonymity.
How could one exclude a person from the anonymous analysis (if they opt out) without identifying them? They cannot.
Aggregate vs Individual Analysis
The chances that one would analyze a person’s face or body language for a few seconds and infer their psychological state are slim.
On the other hand, analyzing a room of people multiple times per second and combining this with survey and attendance data can be insightful.
Likewise, assessing a single person’s reaction to inform life-changing events such as a job interview or loan application should never happen (sadly, humans do it often).
Facial analysis is accurate in group settings and with the appropriate context. Unlike facial recognition, it does not use unique biometric identifiers to distinguish individuals. It is entirely anonymous and non-identifying.
By contrast, measuring the responses of hundreds or even thousands of people to infer their experience and adjust programming accordingly is much safer: higher certainty and lower risk.
Like most things in life, the use of technology is not black or white across the board — context and safeguards make all the difference.
Concluding remarks
Our ethical facial analysis brings organizations valuable and actionable data without crossing the line into collecting personally identifiable information.
It is a rare example of technology exercising restraint. It is an example of building proactive privacy safeguards by default. It is an example to follow.
We have turned down many revenue opportunities that did not align with our values — this came at a personal cost for every one of our team members.
Our commitment to the industry and our community is this:
We will continue innovating and pushing the entire industry forward with the help of our friends and partners.
We will do this to the best of our ability and with the utmost care and protection of people’s privacy.
Our team is reachable and available to speak with everyone who wants to learn more about what we do and how we do it. Contact us here.