
GDPR-Compliant AI Teammate: EU Data Residency, DPA, Article 22

A GDPR-compliant AI teammate stores call audio, transcripts, and customer memory in EU regions only, signs a Data Processing Agreement under GDPR Article 28, does not train models on customer conversations, and labels automated interactions in line with Article 22 transparency requirements. Ainora is built that way by default.

This page describes Ainora as a GDPR-compliant product. It does not offer compliance review, audit, or legal advisory services. CTAs on this page are “book a demo” or “call the demo number” - not “schedule a compliance audit”. Consult qualified legal counsel for compliance decisions specific to your organisation.

See providers compared →

EU regions only
No US transfer, no enterprise upsell required
Source: GDPR, EUR-Lex
DPA on request
Signed under GDPR Article 28 for every customer
No training
Customer conversations never used to train any model
Per-tenant
Workspace-scoped memory, no cross-tenant leakage

Why Are Most “GDPR-Compliant” AI Agents Only Conditionally Compliant?

The GDPR was published in 2016 and applied from May 2018 (Regulation 2016/679, EUR-Lex). Almost every AI agent vendor selling into Europe today claims to be GDPR-compliant. The substance behind that claim varies widely.

EU region availability is not the same as EU residency by default. Most US-built AI agent platforms offer an EU storage region - on the enterprise plan. Self-serve and mid-market customers often default to US regions and have to negotiate EU residency as a contractual addendum. The European Commission's legal framework on EU data protection makes clear this is a meaningful legal distinction.

A “GDPR-ready” claim is not the same as a signed Data Processing Agreement. GDPR Article 28 requires controllers (the customer) and processors (the AI vendor) to have a written DPA covering the scope of processing, sub-processors, security measures, and termination. Some vendors will sign one; some will not.

Sub-processor disclosure is uneven. GDPR requires processors to disclose their own sub-processors and obtain controller authorisation. Some AI agent vendors disclose; some do not.

Model training on customer data is the silent default at many vendors. Unless the vendor explicitly contracts that customer conversations will not be used to train any model, the default is often that they will.

Automated decisioning under Article 22 is rarely addressed concretely. GDPR Article 22 limits decisions made solely by automated processing that produce legal or similarly significant effects. Most AI agent vendors do not discuss it on their marketing pages. See the Article 22 navigable summary on GDPR.eu for the regulatory text.

What Does GDPR Compliance Actually Require for Voice AI?

GDPR is a long regulation. For AI voice and ops agents specifically, five articles dominate the practical picture. Detailed guidance is published by the European Data Protection Board (EDPB guidelines index).

Article 6 - lawful basis. Every processing activity needs a documented lawful basis: consent, contract, legal obligation, vital interests, public interest, or legitimate interests. For an AI agent taking a customer call to handle a service request, “performance of a contract” is typically the basis; for outbound calls, consent or legitimate interests with a documented balancing test.

Article 22 - automated decision-making. A data subject has the right not to be subject to a decision based solely on automated processing that produces legal effects. Voice agents that screen, schedule, summarise, and escalate to humans do not trigger Article 22 for the screening itself, because the binding decision stays with a human. The vendor must support that human-in-the-loop pattern; the customer must use it.

Article 28 - processor obligations. The vendor must sign a Data Processing Agreement covering processing scope, security measures, sub-processors, notification of sub-processor changes, audit rights, and return or deletion of data at termination.

Article 32 - security of processing. Appropriate technical and organisational measures: encryption in transit and at rest, access controls, and periodic testing of those measures.

Article 44 onward - international transfers. Personal data leaving the EEA needs a transfer mechanism: adequacy decision, Standard Contractual Clauses, or Binding Corporate Rules. The cleanest answer for a European AI agent is: do not transfer. Keep the data in the EU.
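The Article 22 pattern described above - the agent screens and summarises, but any legally significant decision is routed to a human - can be sketched in a few lines. This is an illustrative sketch, not Ainora's API; the names (`CallOutcome`, `route`, `SIGNIFICANT_ACTIONS`) are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class CallOutcome:
    intent: str            # e.g. "reschedule", "cancel_contract"
    summary: str           # agent-written call summary for the human reviewer
    proposed_action: str   # what the agent would do next

# Actions with legal or similarly significant effects (Article 22 territory)
# must always be decided by a human, never executed automatically.
SIGNIFICANT_ACTIONS = {"terminate_contract", "deny_claim", "grant_credit"}

def route(outcome: CallOutcome) -> str:
    """Handle routine requests automatically; escalate binding decisions."""
    if outcome.proposed_action in SIGNIFICANT_ACTIONS:
        return "escalate_to_human"   # human makes the binding decision
    return "handle_automatically"    # screening/scheduling stays automated

# A reschedule is handled automatically; a contract termination is escalated.
route(CallOutcome("reschedule", "caller wants a new slot", "reschedule"))
route(CallOutcome("cancel", "caller disputes a bill", "terminate_contract"))
```

The point of the pattern is that the automated step never produces the legal effect itself; it only prepares a recommendation for the human who does.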

How Is Ainora GDPR-Compliant by Design?

EU regions by default

Audio, transcripts, memory, and embeddings stored in EU regions for every workspace. No US transfer. No "enterprise upgrade" gate.

Per-tenant isolation

Each workspace runs scoped tools and memory. No cross-tenant leakage of audio, transcripts, or customer data.

No model training on customer data

Conversations are not used to train Ainora's models or any underlying model. Contractually committed in the DPA.

Signed DPA on request

Every customer gets a Data Processing Agreement under GDPR Article 28 with sub-processor list and audit rights.

Article 22 transparency

Inbound callers hear an automated-call disclosure. Voice agents screen and escalate; binding decisions stay with human teammates.

Honest Read: “EU Region Available” vs “EU-Default for All”

| Criterion | US enterprise platforms | DACH vendors | Altis | Ainora |
| --- | --- | --- | --- | --- |
| EU region | Available on enterprise plan | Yes | On request | Default for all customers |
| DPA signed under Article 28 | On enterprise plan | Yes | On request | Yes, every customer |
| Sub-processor disclosure | On request | Yes | Limited | Yes, on request |
| No training on customer data | Varies | Generally yes | Yes | Yes, contractual |
| Article 22 disclosure on calls | Inconsistent | Yes (voice product) | N/A (no voice) | Yes |
| Per-tenant isolation | Enterprise tier | Yes | Yes | Yes |

Comparison reflects publicly available product positioning as of 2026-05-05. Sources: each vendor's own product pages and DPAs.

Honest framing

Several vendors run a clean GDPR posture on the enterprise plan and a US-default posture on smaller plans. The wedge for European mid-market is “EU residency without negotiating the enterprise tier.” That is what Ainora ships.

Where Does GDPR Posture Determine the Vendor Choice?

Frequently Asked Questions

Where is customer data stored?

EU regions on Google Cloud and AWS Frankfurt, depending on the workload. The specific region is named in the Data Processing Agreement.

Does Ainora train models on customer conversations?

No. Conversations are not used to train Ainora's models or any underlying model. The commitment is contractual in the DPA.

Will Ainora sign a Data Processing Agreement?

Yes - every customer receives a signed DPA on request, with sub-processor list, security annex, and audit rights.

How does Ainora handle GDPR Article 22?

Article 22 limits decisions based solely on automated processing that produce legal or similarly significant effects. Ainora's voice agents screen, schedule, summarise, and escalate to humans; binding decisions (granting credit, denying a claim, terminating a contract) remain with the human teammates the customer designates. Used that way, Article 22 does not block the deployment.

How do callers know they are talking to an AI?

A clear automated-call disclosure at the start of the call. Phrasing is configurable per language and per customer regulator.

Is Ainora SOC 2 certified?

Not yet. We do not claim "SOC 2 ready" because we are not yet audited. SOC 2 is on the roadmap. Our compliance posture today leads with GDPR, EU residency, per-tenant isolation, and no-training-on-customer-data - not with US security frameworks.

Do you disclose your sub-processors?

Yes, on request alongside the DPA. The list covers the cloud regions, the underlying language model providers, and any other entity that processes customer data on our behalf.

Justas Butkus

Founder & CEO, AInora

Building AI digital administrators that replace front-desk overhead for service businesses across Europe. Previously built voice AI systems for dental clinics, hotels, and restaurants.


Ready to try AI for your business?

Hear how AInora sounds handling a real business call. Try the live voice demo or book a consultation.