## Timeline
| Date | Milestone |
|---|---|
| Aug 2024 | EU AI Act enters into force |
| Feb 2025 | Prohibited AI practices apply |
| Aug 2025 | General-purpose AI rules + governance |
| Aug 2026 | High-risk AI system obligations (Annex III) |
| Aug 2027 | Full enforcement for all AI systems |
## Risk Categories

The EU AI Act classifies AI systems by risk level:

### Unacceptable Risk

Banned. Social scoring, real-time biometric surveillance, and manipulative AI are prohibited outright.

### High Risk

Strict requirements. AI in healthcare, finance, HR, education, and law enforcement must meet transparency, documentation, and data governance standards.

### Limited Risk

Transparency obligations. Chatbots and AI-generated content must disclose AI involvement to users.

### Minimal Risk

No additional requirements. Spam filters, AI in games. Most AI applications fall here.
## Key Requirements for AI Systems
### Data Minimization (Article 10)

**Requirement:** Training and input data must be relevant, representative, and limited to what is necessary.

**Risk:** Sending full user conversations to LLMs includes far more personal data than necessary for the AI task.

**With Blindfold:** Tokenize PII before AI calls. The LLM receives only the information needed for its task — personal identifiers are replaced with anonymous tokens.
### Transparency (Article 13)

**Requirement:** AI systems must be transparent about how they process data. Users must be informed when interacting with AI.

**With Blindfold:** Audit logs document exactly what personal data was detected, what was anonymized, and what was sent to the AI provider. This creates a clear record for transparency requirements.
### Documentation & Record-Keeping (Articles 11–12)

**Requirement:** High-risk AI systems must maintain technical documentation and log all operations.

**With Blindfold:** Every API call is logged with entity types detected, policy used, timestamp, and region. Export these logs for regulatory documentation.
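The export format the SDK produces is not specified here; as one plausible approach, logged events can be written out as JSON Lines (one event per line), a format auditors and log tooling handle well. The record fields below are illustrative, matching the fields this guide lists.

```python
import json
import os
import tempfile

def export_audit_logs(records, path):
    """Write audit records as JSON Lines, one event per line,
    suitable for inclusion in regulatory documentation."""
    with open(path, "w", encoding="utf-8") as f:
        for rec in records:
            f.write(json.dumps(rec, sort_keys=True) + "\n")

# Two example anonymization events (field names illustrative).
records = [
    {"timestamp": "2026-08-02T09:00:00Z", "entity_types": ["EMAIL"],
     "policy": "gdpr_eu", "region": "eu"},
    {"timestamp": "2026-08-02T09:05:00Z", "entity_types": ["PERSON", "SSN"],
     "policy": "hipaa_us", "region": "us"},
]
path = os.path.join(tempfile.gettempdir(), "blindfold_audit_export.jsonl")
export_audit_logs(records, path)
```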
### Data Governance (Article 10)

**Requirement:** Data used in AI systems must meet quality, relevance, and privacy standards.

**With Blindfold:** The `detect()` method lets you audit text for personal data without modifying it — useful for data governance reviews and quality checks.

### Human Oversight (Article 14)

**Requirement:** High-risk AI systems must allow human oversight and intervention.

**With Blindfold:** Tokenization is reversible — humans can always see the real data via `detokenize()`, while the AI only works with anonymized versions. This maintains human oversight of the actual information.

## How Blindfold Maps to the EU AI Act
| AI Act Requirement | Article | Blindfold Feature |
|---|---|---|
| Data minimization | Art. 10 | `tokenize()` removes PII before AI input |
| Transparency | Art. 13 | Audit logs document all PII processing |
| Documentation | Art. 11–12 | Export audit logs for regulatory records |
| Data governance | Art. 10 | `detect()` audits data for PII |
| Human oversight | Art. 14 | `detokenize()` restores data for human review |
| Data protection by design | Art. 10 | SDK-level PII protection in your pipeline |
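The human-oversight row depends on tokenization being reversible. The stand-in class below is not the Blindfold implementation — it only masks email addresses with a regex and keeps a local token vault — but it illustrates the round-trip the table describes: the AI sees tokens, a human reviewer can recover the originals.

```python
import re

class ReversibleTokenizer:
    """Stand-in for reversible PII tokenization (not the Blindfold SDK)."""

    def __init__(self):
        self._vault = {}   # token -> original value
        self._counter = 0

    def tokenize(self, text):
        # Replace email addresses with opaque tokens; a real SDK
        # covers many more entity types.
        def repl(match):
            self._counter += 1
            token = f"<EMAIL_{self._counter}>"
            self._vault[token] = match.group(0)
            return token
        return re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", repl, text)

    def detokenize(self, text):
        # Restore originals so a human reviewer sees the real data.
        for token, original in self._vault.items():
            text = text.replace(token, original)
        return text

t = ReversibleTokenizer()
safe = t.tokenize("Contact alice@example.com about the claim.")
restored = t.detokenize(safe)
```

The design point is that the vault never leaves your infrastructure: only `safe` is sent to the model, and `restored` exists solely for the human in the loop.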
## High-Risk AI Systems

The EU AI Act imposes stricter requirements on AI systems in these domains:

- Healthcare
- Financial Services
- Human Resources
- Education

### Healthcare

**AI Act Classification:** High-risk (Annex III, Section 5)

**Requirements:** Robust data governance, thorough testing, documentation of training data, continuous monitoring.

**Blindfold Approach:**

- Use `region="us"` or `region="eu"` depending on patient location
- Apply the `hipaa_us` (US patients) or `gdpr_eu` (EU patients) policy
- Tokenize all PHI before clinical AI tools
- Maintain an audit trail for regulatory inspections
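The first two bullets amount to a dispatch rule: pick the policy and region from where the patient is located. The policy names (`hipaa_us`, `gdpr_eu`) and region values come from this guide; the helper function itself is a hypothetical sketch of how that selection might be wired up.

```python
def select_policy(patient_region):
    """Pick an anonymization policy from the patient's region.

    `hipaa_us` and `gdpr_eu` are the policy names this guide uses;
    the dispatch logic is illustrative, not part of the SDK.
    """
    mapping = {"us": "hipaa_us", "eu": "gdpr_eu"}
    try:
        return mapping[patient_region]
    except KeyError:
        # Fail closed: refuse to process rather than guess a policy.
        raise ValueError(f"No policy configured for region {patient_region!r}")
```

Failing closed on an unknown region is deliberate: sending PHI with the wrong (or no) policy is exactly the outcome the high-risk requirements are meant to prevent.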
## Code Examples
### Data Minimization for AI Calls
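The original code sample did not survive extraction. As a stand-in, the sketch below shows the data-minimization pattern with plain regexes masking emails and US-style phone numbers; in a real deployment the Blindfold SDK's tokenization would replace `tokenize_pii`, and the function name and placeholder strings here are assumptions.

```python
import re

def tokenize_pii(text):
    """Stand-in for SDK tokenization: mask emails and US-style
    phone numbers before text is sent to an LLM."""
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "<EMAIL>", text)
    text = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "<PHONE>", text)
    return text

def build_prompt(user_message):
    # The model receives only what it needs for the task (Article 10):
    # the request, with direct identifiers replaced by placeholders.
    return f"Summarize this support request:\n{tokenize_pii(user_message)}"

prompt = build_prompt("Reach Bob at bob@corp.com or 555-867-5309 re: refund.")
```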
### Audit Data for PII (Data Governance)

Use `detect()` to check datasets for personal data without modifying them:
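The original snippet for this section is also missing. The version below is a self-contained illustration of the detect-without-modifying idea using two regex patterns; the SDK's real `detect()` presumably covers far more entity types, and the report structure shown is an assumption, not its actual return format.

```python
import re

# Illustrative subset of detectable entity types.
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def detect_pii(text):
    """Report which entity types appear and where, leaving the
    input text untouched — suitable for governance reviews."""
    findings = []
    for entity_type, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            findings.append({"type": entity_type, "span": match.span()})
    return findings

sample = "SSN 123-45-6789 on file for jane@site.org"
report = detect_pii(sample)
```

Because nothing is rewritten, the same call can run safely against production datasets during a quality or governance review.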
## Relationship with GDPR

The EU AI Act and GDPR are complementary:

| Aspect | GDPR | EU AI Act |
|---|---|---|
| Focus | Personal data protection | AI system safety and transparency |
| Scope | Any personal data processing | AI systems operating in the EU |
| Data requirements | Minimize collection | Minimize AI inputs + ensure quality |
| Documentation | Processing records (Art. 30) | Technical documentation (Art. 11) |
| Oversight | Data Protection Officers | AI governance structures |
Use the `gdpr_eu` policy with the EU region to satisfy both regulations simultaneously: GDPR protects the personal data, while the audit trail satisfies AI Act transparency requirements.
## EU AI Act Compliance Checklist

- **Classify your AI system.** Determine if your AI system is high-risk, limited-risk, or minimal-risk under the EU AI Act.
- **Implement audit logging.** Use Blindfold’s audit trail to document what PII was detected and anonymized.
- **Document your data pipeline.** Record how data flows through your system, where PII is detected, and how it’s protected.