A Video Editor's Guide to BIPA, GDPR, and Facial Recognition Compliance
If your video tools use face or voice recognition, privacy laws apply. Here is what BIPA, GDPR, and other regulations mean for video teams, in plain English.
Video tools are getting smarter. Face recognition, speaker identification, voice matching. These features make editing faster, but they also bring your workflow within the scope of biometric privacy laws that most video professionals have never heard of.
This is not a legal guide. We are not lawyers and this is not legal advice. But as a team building video tools with face and voice recognition, we have spent a lot of time understanding these regulations, and we think video editors should know the basics.
What counts as biometric data?
Biometric data is any measurement of a physical or behavioural characteristic that can identify a person. In the context of video production, the most relevant types are:
- Faceprints. The mathematical representation a face recognition system creates from someone's face. This is not the photo itself but the computed template used for matching.
- Voiceprints. Similar to faceprints but derived from someone's voice. Speaker diarization and voice matching systems generate these.
- Gait patterns, iris scans, fingerprints. Less common in video editing tools but covered by the same laws.
The critical distinction: a photo of someone's face is generally not biometric data. But running that photo through a face recognition model to generate a faceprint creates biometric data under most of these laws.
If your video tool clusters faces or identifies speakers, it is creating biometric data.
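To make the faceprint idea concrete, here is a minimal Python sketch. The vectors, names, and threshold are invented for illustration; real systems derive embeddings from a trained face recognition model and tune match thresholds empirically.

```python
import math

def cosine_similarity(a, b):
    # A faceprint is typically a fixed-length float vector (an "embedding").
    # Two crops of the same face should produce nearby vectors; matching
    # means comparing their angle, not comparing pixels.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: the same person twice, then a different person.
alice_take_1 = [0.9, 0.1, 0.3]
alice_take_2 = [0.85, 0.15, 0.35]
bob = [0.1, 0.9, 0.2]

MATCH_THRESHOLD = 0.8  # illustrative; real systems calibrate this carefully

print(cosine_similarity(alice_take_1, alice_take_2) > MATCH_THRESHOLD)  # True: same face
print(cosine_similarity(alice_take_1, bob) > MATCH_THRESHOLD)           # False: different faces
```

The photo is the input; the vector is the biometric data. Storing these vectors, even without storing the photos, is what triggers laws like BIPA.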
BIPA: Illinois' Biometric Information Privacy Act
BIPA is the most aggressive biometric privacy law in the United States, and it has teeth.
Key requirements:
- Written consent before collection. You must inform someone in writing that their biometric data is being collected, explain the purpose, and get their written consent before processing.
- No sale or profit from biometric data. You cannot sell, lease, trade, or otherwise profit from someone's biometric data.
- Retention and destruction policy. You must have a publicly available written policy for how long you retain biometric data and when you destroy it.
- Reasonable security. You must protect biometric data at least as carefully as you protect other confidential information.
Why editors should care: BIPA has a private right of action. Individuals can sue directly, and damages are $1,000 per negligent violation and $5,000 per intentional or reckless violation. Class action lawsuits under BIPA have resulted in settlements in the hundreds of millions of dollars. Facebook paid $650 million. Google paid $100 million. Clearview AI, BNSF Railway, and others have faced significant judgments.
BIPA applies if any person in the footage is an Illinois resident, regardless of where your company is based or where the processing happens.
GDPR: General Data Protection Regulation
The EU's GDPR classifies biometric data as a special category of personal data, which means it gets the strictest protections.
Key requirements:
- Explicit consent or limited exceptions. Processing biometric data requires explicit consent or must fall under a narrow list of exceptions (employment law, vital interests, etc.). Legitimate interest, which works for many other data types, is generally not sufficient for biometrics.
- Data minimisation. Only process what you actually need. If you only need to detect faces (not identify them), do not create persistent faceprints.
- Right to erasure. Individuals can request deletion of their biometric data.
- Data protection impact assessment. If you are systematically processing biometric data, you likely need a formal assessment of the privacy risks.
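The minimisation point above can be shown at the data-model level. This is a hypothetical sketch, not any particular tool's schema: a detection record keeps only the geometry needed to blur or count faces, and deliberately discards any identity embedding before anything is persisted.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FaceDetection:
    # Enough to blur or count faces in a shot, but deliberately
    # no embedding field: no biometric template ever exists.
    frame: int
    x: int
    y: int
    w: int
    h: int

def strip_to_detections(raw_results):
    """Keep only detection geometry from an upstream model's output,
    dropping the 'embedding' field before anything is stored."""
    return [FaceDetection(r["frame"], *r["box"]) for r in raw_results]

# Hypothetical raw output from a recognition pipeline
raw = [
    {"frame": 12, "box": (40, 60, 128, 128), "embedding": [0.1, 0.8, 0.3]},
    {"frame": 13, "box": (42, 61, 128, 128), "embedding": [0.1, 0.8, 0.3]},
]

detections = strip_to_detections(raw)
print(all(not hasattr(d, "embedding") for d in detections))  # True
```

Under GDPR's minimisation principle, a pipeline that never computes or persists an identity template is processing far less sensitive data than one that does.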
Why editors should care: GDPR applies to anyone processing data of EU residents. If your footage includes people from the EU, or if your team is in the EU, GDPR governs how you handle face and voice recognition. Fines can reach €20 million or 4% of global annual revenue, whichever is higher.
Other laws to know about
BIPA and GDPR get the most attention, but the landscape is expanding:
- Texas CUBI (Capture or Use of Biometric Identifier Act). Similar to BIPA but enforced by the state attorney general rather than private lawsuits. Texas recently secured a $1.4 billion settlement against Meta over biometric data practices.
- Washington state biometric identifier law. Requires consent for commercial use of biometric data.
- Colorado, Connecticut, Virginia, and other state privacy laws. Several US states have passed comprehensive privacy laws that include biometric data protections.
- Canada's PIPEDA and provincial laws. Canada treats biometric data as sensitive personal information requiring meaningful consent.
The trend is clear: more jurisdictions are regulating biometric data, not fewer. Building compliant practices now saves pain later.
What this means for video teams in practice
If your workflow involves face recognition or voice identification, here are the practical implications:
Consent is the foundation
Before processing footage through face or voice recognition, you need consent from the people in the footage. For corporate videos where talent signs releases, add biometric data collection to the release. For documentary or event footage where getting consent from every person on screen is impractical, consider whether you actually need face recognition on that footage.
Storage and security matter
Where biometric data is stored and who can access it are central concerns under every major regulation. Cloud-based processing raises additional questions: which jurisdiction are the servers in? Who has access to the data? How long is it retained?
Retention policies
Do not keep biometric data longer than you need it. Define a retention period, document it, and stick to it. When a project wraps and you no longer need the face clusters, delete them.
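A retention policy only works if something enforces it. As a rough sketch, assuming a hypothetical local JSON index where each face-cluster entry carries a `created_at` Unix timestamp, a pruning step might look like this:

```python
import json
import time
from pathlib import Path

RETENTION_DAYS = 90  # illustrative; use whatever your written policy says

def prune_face_clusters(index_path, now=None):
    """Delete face-cluster entries older than the retention window from a
    hypothetical JSON index file. Returns the number of entries removed."""
    now = time.time() if now is None else now
    cutoff = now - RETENTION_DAYS * 86400
    index = json.loads(Path(index_path).read_text())
    before = len(index["face_clusters"])
    # Keep only entries still inside the retention window.
    index["face_clusters"] = [
        c for c in index["face_clusters"] if c["created_at"] >= cutoff
    ]
    Path(index_path).write_text(json.dumps(index, indent=2))
    return before - len(index["face_clusters"])
```

Run something like this when a project wraps or on a schedule, and the documented retention period becomes an enforced one rather than a statement of intent.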
Vendor responsibility
Using a third-party tool for face recognition does not shift the compliance burden entirely to the vendor. As the data controller (the person deciding to process the footage), you share responsibility for ensuring compliant practices.
How FrameQuery's architecture helps
We designed FrameQuery with these regulations in mind. The local-first architecture is not just a performance decision; it is a privacy decision.
Processing is ephemeral. When FrameQuery processes your footage, lightweight proxies are sent to our servers for AI analysis. The proxies are deleted after processing. Face embeddings and voice data are computed during processing and stored only in your local index file, not on our servers.
Your data stays on your machine. The search index, including any face clusters and voice data, lives on your local disk. We do not have access to it. We cannot sell it, share it, or lose it in a breach, because we do not have it.
Deletion is straightforward. Deleting biometric data means deleting your local index file or removing specific entries from it. You do not need to submit a request to a cloud provider and hope they actually purge it from their backups.
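Handling an erasure request against a local index can be a short script. This sketch assumes the same hypothetical JSON layout as above, with a `person_id` on each face-cluster entry; it is not FrameQuery's actual file format.

```python
import json
from pathlib import Path

def erase_person(index_path, person_id):
    """Drop every face-cluster entry linked to one person from a
    hypothetical local JSON index. Because the index is a local file,
    this is the whole deletion story: no cloud copy to chase."""
    index = json.loads(Path(index_path).read_text())
    before = len(index["face_clusters"])
    index["face_clusters"] = [
        c for c in index["face_clusters"] if c.get("person_id") != person_id
    ]
    Path(index_path).write_text(json.dumps(index, indent=2))
    return before - len(index["face_clusters"])
```

Compare that to a cloud workflow, where the same request means a support ticket, a vendor's deletion pipeline, and trust that backups are purged too.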
No server-side retention of biometric data. During processing, face and voice analysis happen on our servers, but the resulting embeddings are delivered to your machine and purged from our infrastructure. The biometric data's resting place is your local disk, under your control.
This does not make compliance automatic. You still need consent from the people in your footage. You still need retention policies. You still need to understand which laws apply to your situation. But local-first architecture removes an entire category of risk: the risk that your biometric data is sitting on someone else's servers, subject to their security practices, their data handling, and their business decisions.
Practical steps for video teams
- Talk to a lawyer. Seriously. This post is an overview, not legal advice. If you are processing biometric data commercially, get proper legal counsel.
- Update your talent releases. Add language about biometric data processing if you use face or voice recognition in your workflow.
- Document your practices. Write down what biometric data you collect, why, how you store it, and when you delete it. BIPA specifically requires a publicly available written policy.
- Choose tools carefully. Understand where your tools send data and what they retain. A cloud platform that stores faceprints on their servers creates a very different risk profile than a local tool where the data never leaves your machine.
- Minimise what you collect. If you do not need face recognition for a project, do not run it. Data you do not collect cannot be mishandled.
Privacy regulations are not going away. Video teams that build compliant practices now will be ahead of the curve. Join the waitlist to see how local-first video search handles biometric data by design.