The SAR Challenge
A Subject Access Request (SAR) under GDPR Article 15 requires you to provide a complete copy of all personal data you hold about an individual. Under Article 12(3) you have one month to respond, commonly operationalized as 30 days.
For a typical SaaS application, this means querying 5-15 tables, aggregating the results, and formatting them in a readable way. Do it manually once and it takes an hour. Get 10 SARs per month and it becomes a standing operational burden.
The Manual Approach (and Why It Fails)
-- Query each table separately
SELECT email, name, phone FROM users WHERE user_id = 'user-123';
SELECT street, city, zip FROM profiles WHERE user_id = 'user-123';
SELECT order_id, shipping_address FROM orders WHERE user_id = 'user-123';
SELECT ip_addr, user_agent, created_at FROM sessions WHERE user_id = 'user-123';
Problems: you will eventually forget a table (especially after a schema migration), there is no audit trail of the request, and the output is a pile of unstructured result sets.
Automated SAR with pgcomply
Quick Inspection
SELECT pgcomply.inspect('user-123');
This queries every table registered in the PII registry for the given user ID and returns a consolidated view of all their personal data. The output is grouped by table, showing exactly what data you hold.
Machine-Readable Export
For Article 20 (data portability), the data must be in a structured, commonly used, machine-readable format:
SELECT pgcomply.export_user_data('user-123', 'json');
Returns structured JSON:
{
  "export_type": "gdpr_article_20",
  "subject_id": "user-123",
  "exported_at": "2026-02-20T14:30:00Z",
  "data": {
    "users": [{"email": "alice@example.com", "name": "Alice M."}],
    "profiles": [{"street": "Musterstr. 42", "city": "Berlin"}],
    "orders": [{"order_id": "ORD-001", "shipping_address": "..."}],
    "_consent": [{"purpose": "newsletter", "status": "active", "granted_at": "..."}]
  }
}
The _consent section is automatically included, showing all consent records — useful for demonstrating your lawful basis.
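Before forwarding an export to the data subject, it is worth sanity-checking it in application code. A minimal Python sketch, assuming the JSON structure shown above (the field names are taken from the example; adjust them to your actual export):

```python
import json
from datetime import datetime

def validate_export(raw: str) -> dict:
    """Basic sanity checks on an Article 20 export like the one above."""
    export = json.loads(raw)
    # The export must identify itself, the subject, and the export time.
    for field in ("export_type", "subject_id", "exported_at", "data"):
        if field not in export:
            raise ValueError(f"missing field: {field}")
    # The timestamp should be valid ISO 8601 (trailing 'Z' means UTC).
    datetime.fromisoformat(export["exported_at"].replace("Z", "+00:00"))
    # Consent records should be present to document the lawful basis.
    if "_consent" not in export["data"]:
        raise ValueError("export lacks _consent section")
    return export

sample = '''{
  "export_type": "gdpr_article_20",
  "subject_id": "user-123",
  "exported_at": "2026-02-20T14:30:00Z",
  "data": {
    "users": [{"email": "alice@example.com", "name": "Alice M."}],
    "_consent": [{"purpose": "newsletter", "status": "active"}]
  }
}'''
export = validate_export(sample)
print(export["subject_id"])  # user-123
```

This catches a truncated or malformed export before it leaves your systems, rather than after the data subject complains.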
Audit Trail
Every inspect() and export_user_data() call is logged in the immutable audit trail:
SELECT * FROM pgcomply.audit_log
WHERE event_type IN ('inspect', 'data_export')
AND details->>'subject_id' = 'user-123';
This gives an auditor verifiable evidence of both the fact and the timing of your response.
Building a SAR Workflow
For production use, wrap pgcomply in an internal API:
-- 1. Log the SAR request
INSERT INTO internal.sar_requests (subject_id, requested_at, deadline)
VALUES ('user-123', NOW(), NOW() + INTERVAL '30 days');
-- 2. Generate the export
SELECT pgcomply.export_user_data('user-123', 'json');
-- 3. The audit trail captures the export automatically
-- 4. For Pro users: generate a certified SAR response
SELECT pgcomply.sar('user-123');
The Pro sar() function generates a formatted response document with all data, consent records, processing purposes, and retention periods — ready to send to the data subject.
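In application code, the workflow above can be wrapped in a single function. A minimal Python sketch that prepares the steps as parameterized (sql, params) pairs ready to hand to any PostgreSQL driver such as psycopg; the driver and the internal.sar_requests table are taken from the example above, and the function name is a placeholder:

```python
def sar_workflow_statements(subject_id: str, deadline_days: int = 30):
    """Build the parameterized statements for one SAR, in execution order."""
    return [
        # 1. Record the request and its deadline.
        ("INSERT INTO internal.sar_requests (subject_id, requested_at, deadline) "
         "VALUES (%s, NOW(), NOW() + %s * INTERVAL '1 day')",
         (subject_id, deadline_days)),
        # 2. Generate the machine-readable export; the call itself is
        #    captured in the audit trail automatically.
        ("SELECT pgcomply.export_user_data(%s, 'json')", (subject_id,)),
    ]

for sql, params in sar_workflow_statements("user-123"):
    print(sql, params)
    # e.g. with psycopg: cur.execute(sql, params)
```

Passing the subject ID as a bound parameter (rather than interpolating it into the SQL string) matters here: SAR subject IDs arrive from outside your system and must be treated as untrusted input.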
Handling Edge Cases
User has data in multiple databases: pgcomply handles one PostgreSQL instance. For multi-database architectures, call inspect() on each instance and aggregate externally.
User requests data in a specific format: JSON is the default and satisfies Article 20. If the user requests CSV, convert the JSON output.
User's identity cannot be verified: Log the request and respond within the deadline asking for verification; Article 12(6) lets you request additional information to confirm identity. Do not release data until identity is verified.
Request is manifestly excessive: Under Article 12(5) you may charge a reasonable fee or refuse, but you must explain why within the deadline.
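Converting the JSON export to CSV, as mentioned above, is straightforward application code. A minimal Python sketch that produces one CSV document per table in the export (field names follow the example export earlier in this article):

```python
import csv
import io
import json

def export_to_csv(raw: str) -> dict:
    """Convert a JSON export like the one above to one CSV string per table."""
    export = json.loads(raw)
    csvs = {}
    for table, rows in export["data"].items():
        if not rows:
            continue
        buf = io.StringIO()
        # Union of keys across all rows, in first-seen order, as the header.
        fields = list(dict.fromkeys(k for row in rows for k in row))
        writer = csv.DictWriter(buf, fieldnames=fields, restval="")
        writer.writeheader()
        writer.writerows(rows)
        csvs[table] = buf.getvalue()
    return csvs

sample = '''{
  "export_type": "gdpr_article_20",
  "subject_id": "user-123",
  "data": {
    "users": [{"email": "alice@example.com", "name": "Alice M."}],
    "_consent": [{"purpose": "newsletter", "status": "active"}]
  }
}'''
csvs = export_to_csv(sample)
print(csvs["users"])
```

Building the header as the union of keys (with restval="") handles rows that omit optional columns, which happens whenever nullable fields are dropped from the JSON.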
Summary
Subject Access Requests are a recurring operational burden. pgcomply.inspect() and export_user_data() automate the data collection across all PII tables, produce structured exports, and log everything in the audit trail. What used to take an hour per request now takes one SQL call.