AI in your own cloud, under your own GDPR controls.
The first GDPR question is usually not about the model. It is about the boundary. Where does the data live? Who is the controller? Who is the processor? What new transfer risk did we just create by turning on AI? Invisibles is built to make those questions easier to answer, not harder. The software deploys into your own AWS or Azure account, under your IAM, inside your security boundary. There is no shared customer infrastructure and no standing access for Invisibles to your data. Your Salesforce tokens live in your own secrets store. Your logs stay in your environment. Your existing regional choices for data residency, retention, and access control can continue to apply.
Controller, processor, and Article 28.
Under GDPR, role clarity comes first. In most deployments, you remain controller because you determine the purposes of the processing. Invisibles acts as processor to the extent it processes personal data on your documented instructions, with the software running inside your own cloud environment. That maps cleanly to Article 28, where controllers must use processors that provide sufficient guarantees around technical and organisational measures.
The practical advantage is that Invisibles is not asking you to move your customer data into a new multi-tenant SaaS perimeter just to use AI. The runtime sits in your AWS or Azure account. Access is governed by your IAM. Secrets stay in your own AWS Secrets Manager or Azure Key Vault. That does not remove your Article 28 obligations, but it does make them easier to document in a DPA, easier to explain in procurement, and easier to defend in a privacy review.
Lawful basis still belongs to the customer.
Invisibles does not create a new lawful basis for processing. You still need one. If you are using AI to support customer service, internal operations, fraud prevention, or contract performance, the lawful-basis analysis remains yours as controller. If you are relying on consent for a specific use case, that consent still needs to be captured, recorded, and respected in the underlying business process.
What changes with Invisibles is the control surface. Prompts, Skills, Data Context Mappings, Agents, and Audit give you a governed way to decide what data is exposed to a use case, what gets masked, what gets logged, and who can invoke it. That matters for Article 5 principles like purpose limitation and data minimisation. A Prompt should not see more than it needs. A Skill should not be callable by everyone. A Data Context Mapping should pin the exact fields in scope.
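Data Context Mappings are product configuration, but the minimisation idea behind them can be sketched in plain Python: an explicit field allow-list applied before any record reaches a prompt. The mapping name and fields below are illustrative, not Invisibles' actual schema.

```python
# Illustrative only: a hypothetical field allow-list applied before a record
# reaches a prompt. This is not Invisibles' real Data Context Mapping schema.
CASE_SUMMARY_MAPPING = {"case_id", "status", "product", "last_update"}

def minimise(record: dict, allowed_fields: set) -> dict:
    """Return only the fields the use case is mapped to (Article 5 minimisation)."""
    return {k: v for k, v in record.items() if k in allowed_fields}

record = {
    "case_id": "C-1042",
    "status": "open",
    "product": "billing",
    "last_update": "2024-05-01",
    "customer_email": "jane@example.com",  # never needed for a case summary
    "date_of_birth": "1990-01-01",         # never needed for a case summary
}

scoped = minimise(record, CASE_SUMMARY_MAPPING)
# 'scoped' carries no direct identifiers; only mapped fields reach the prompt.
```

The point of pinning fields explicitly, rather than passing whole records, is that purpose limitation becomes a reviewable artifact instead of an intention.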
Data residency, international transfers, and Schrems II.
A lot of GDPR anxiety around AI is really transfer anxiety. If personal data leaves the EEA, what is the transfer mechanism? Are SCCs needed? What supplementary measures exist? How does this hold up after Schrems II? Invisibles reduces the number of moving parts in that conversation. The software runs in the region and cloud account you choose. If you already operate in an EU AWS or Azure region, Invisibles can run there too. Your logs, tokenization store, and application runtime stay in that environment.
That does not make transfer analysis disappear. If you choose a model provider or sub-processor that involves a transfer, you still need to assess that path and put the right contractual and technical measures in place. The architecture helps because you are not automatically creating a second copy of your operational data in a vendor-controlled environment. Structured tokenization, field-level masking, and customer-controlled deployment are all relevant supplementary measures when privacy teams assess cross-border risk.
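Field-level masking in Invisibles is done with AWS Comprehend or Microsoft Presidio; as a rough illustration of the effect only, a regex stand-in (far less capable than either service) might look like this:

```python
import re

# Simplified stand-in for entity-based masking. AWS Comprehend and Microsoft
# Presidio do this far more robustly; this is an illustration of the effect.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

def mask(text: str) -> str:
    """Replace direct identifiers with type tags before text crosses a boundary."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

masked = mask("Contact jane.doe@example.com or +44 20 7946 0958 about the refund.")
# masked == "Contact [EMAIL] or [PHONE] about the refund."
```

In a transfer assessment, the relevant property is that what crosses the boundary is the masked text, while the original stays in your environment.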
DSARs, erasure, and evidence.
GDPR compliance is not only about where data sits. It is also about what happens when a person exercises rights. Access, deletion, correction, restriction, and portability requests do not go away because AI is involved. Most companies need to answer a simpler question first: did the AI system process this person’s data, and if so, where is the evidence?
This is where the audit model matters. Invisibles writes an immutable audit trail with six-year retention by default, exportable to CSV, S3, or Splunk. That gives privacy, security, and legal teams a record of what Prompt or Skill ran, through which channel, and under what controls. For right-to-erasure workflows, the important point is that Invisibles is designed to avoid creating unnecessary shadow copies. If the source record is deleted or updated in your system of record, the AI layer does not need to become a second long-term repository of personal data.
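The audit store itself is a product feature, but the tamper-evidence idea behind an immutable trail can be sketched with a simple hash chain, where each record commits to the one before it. The chaining scheme and field names here are an illustration, not Invisibles' actual record format.

```python
import csv, hashlib, io, json

def append_event(chain: list, event: dict) -> None:
    """Append an audit event whose hash covers the previous record's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps(event, sort_keys=True)
    digest = hashlib.sha256((prev + body).encode()).hexdigest()
    chain.append({"hash": digest, **event})

def to_csv(chain: list) -> str:
    """Export the trail for privacy or security review (e.g. DSAR evidence)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(chain[0].keys()))
    writer.writeheader()
    writer.writerows(chain)
    return buf.getvalue()

trail = []
append_event(trail, {"actor": "agent:support", "skill": "summarise_case", "subject": "C-1042"})
append_event(trail, {"actor": "user:ops", "skill": "draft_reply", "subject": "C-1042"})
# Any edit to an earlier record changes every later hash, so tampering is visible.
```

The same export can feed S3 or Splunk pipelines; the durable property a privacy team cares about is that the record of "which Prompt ran, over which subject" cannot be silently rewritten.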
Article 22 and human oversight.
Article 22 is the provision AI projects most often over-interpret. Not every AI-assisted workflow is solely automated decision-making with legal or similarly significant effects. But some use cases can move in that direction, especially if a company starts using AI to score, rank, approve, deny, or route people in ways that materially affect them. That is where governance matters.
Invisibles is designed to support human-in-the-loop deployment patterns. Agents can surface recommendations, summaries, and next steps without forcing full automation. Skills can be permissioned. Audit records can show who invoked what and when. That does not by itself answer every Article 22 question, but it gives you the primitives needed to keep meaningful human review in the process and to document that review in a DPIA under Article 35 where required.
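The gating pattern described above, permissioned Skills plus a required human sign-off before anything with significant effects executes, can be sketched as follows. All names here (the Skills, roles, and `invoke` function) are hypothetical, not Invisibles' actual API.

```python
# Hypothetical sketch of human-in-the-loop gating; not Invisibles' actual API.
SKILL_PERMISSIONS = {
    "close_account": {"ops_lead"},              # narrow: materially affects a person
    "summarise_case": {"ops_lead", "agent"},    # broad: read-only assistance
}
REQUIRES_REVIEW = {"close_account"}  # significant effects -> human sign-off first

def invoke(skill, role, approved_by=None):
    """Run a Skill only if the role is permissioned and any required review is done."""
    if role not in SKILL_PERMISSIONS.get(skill, set()):
        return "denied: role not permissioned for this Skill"
    if skill in REQUIRES_REVIEW and approved_by is None:
        return "pending: human review required before execution"
    return f"executed: {skill}"  # an audit record would capture role and approver

invoke("close_account", "agent")                        # denied
invoke("close_account", "ops_lead")                     # pending human review
invoke("close_account", "ops_lead", approved_by="dpo")  # executed
```

Documented in a DPIA, this is the difference between "the AI decides" and "the AI recommends, a named human approves, and the approval is logged".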
From GDPR requirement to product mechanism.
Article 28 processor controls map to customer-cloud deployment, customer IAM, and DPA support.
Article 32 security of processing maps to field-level masking with AWS Comprehend or Microsoft Presidio, structured tokenization with a 15-minute TTL in DynamoDB or Cosmos DB, prompt-injection defenses, and immutable audit.
Article 35 DPIA work maps to clear system boundaries, logging, and the ability to show exactly which fields are exposed through Data Context Mappings.
Article 22 concerns map to human oversight patterns, permissioned Skills, and audit evidence.
International transfer concerns map to regional deployment in your own cloud plus customer choice over model providers and transfer mechanisms.
DSAR and erasure workflows map to avoiding unnecessary copies and keeping evidence in the audit trail.
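The 15-minute tokenization TTL is enforced by DynamoDB or Cosmos DB in the product. The behaviour it gives you can be illustrated with an in-memory stand-in (illustration only; in practice the database expires the items):

```python
import time, uuid

TTL_SECONDS = 15 * 60  # the documented 15-minute token lifetime

class TokenStore:
    """In-memory stand-in for a DynamoDB/Cosmos DB table with TTL expiry."""
    def __init__(self, now=time.time):
        self._now = now      # injectable clock, useful for testing expiry
        self._items = {}

    def tokenize(self, value):
        token = "tok_" + uuid.uuid4().hex
        self._items[token] = (value, self._now() + TTL_SECONDS)
        return token

    def detokenize(self, token):
        value, expires_at = self._items.get(token, (None, 0))
        if self._now() > expires_at:
            self._items.pop(token, None)  # expired: the mapping is gone
            return None
        return value

store = TokenStore()
tok = store.tokenize("jane.doe@example.com")
# Within the TTL the original value is recoverable; afterwards only the
# meaningless token remains, so no long-lived copy of the identifier exists.
```

For a privacy review, the short TTL is the argument: the token-to-value mapping is transient by design, not a second durable store of personal data.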
Questions privacy teams ask.
Is Invisibles the controller or the processor under GDPR?
In most deployments, you remain controller because you determine the purposes of the processing. Invisibles acts as processor to the extent it processes personal data on your documented instructions, with the software running in your own cloud.
Does Invisibles solve GDPR compliance for us?
No. It supports your existing GDPR program. You still own lawful basis, notices, retention rules, and DPIA decisions. Invisibles is designed to fit those controls instead of creating a separate unmanaged AI perimeter.
Can we keep the deployment inside the EU?
Yes, if you choose EU AWS or Azure regions. The application runtime, logs, and supporting stores run in the region you select inside your own cloud account.
How does Invisibles support DSARs and deletion requests?
By avoiding unnecessary copies and preserving evidence. Your systems of record remain authoritative, and the immutable audit trail helps show when and how AI processing occurred for each data subject.
What about Article 22 automated decision-making?
Risk depends on your use case, not on the label 'AI'. Invisibles supports human oversight, permissioned Skills, and auditability so you can keep significant decisions reviewable and document that review.
Do you offer a DPA?
Yes. A Data Processing Addendum is available on request — typically within 2 business days. The deployment model is designed to make processor obligations easier to document.
This page is for informational purposes only and is not legal advice. A Data Processing Addendum is available on request; email security@invisibles.app. Customers should review their specific obligations with their own privacy, legal, and compliance counsel.
Want our team on a call with your privacy counsel?
Book 30 minutes. Bring whoever needs to be in the room — privacy, security, procurement. We walk through the architecture and answer specific questions against your residency and transfer requirements.