Digital Public Infrastructures (DPIs), such as digital identity, payment, and data exchange systems, are increasingly central to how societies function. They promise efficiency, inclusion, and innovation. But they also carry profound risks for privacy, human rights, and democracy.
This course has shown that privacy is not an obstacle to DPI – it is the foundation of public trust. Without safeguards, DPIs risk becoming tools for surveillance, exclusion, and control. With privacy-by-design and strong governance, they can empower individuals and strengthen societies.
<aside>
📌
DPI must not just promise privacy. It must prove it through technical implementation and oversight.
</aside>
🔐 Core Principles
- Trust and Predictability
    - Trust is the most valuable resource for DPI. Once lost, it is hard to regain.
    - Users need predictable interactions: revealing one's age should not mean being profiled, and logging into a service should not expose unrelated personal data.
- Data Protection as Human Protection
    - Data protection is not about protecting data – it protects the people behind it.
    - Technology can protect or violate human rights. Technical safeguards help align technology with those rights and values, but they cannot replace protections enshrined in law and administrative processes.
🪖 Technical Foundations
- IT security is the baseline. Confidentiality, integrity, and availability are preconditions for privacy. Without strong security, data breaches and cyberattacks undermine entire societies.
- Encryption and digital signatures (public-key cryptography) enable secure communication, selective disclosure, and trust in DPI. They are the backbone of privacy-preserving infrastructures (see the signing sketch after this list).
- Identifiers & Biometrics are powerful but risky. Unique persistent identifiers or centralized biometric databases create lifelong vulnerabilities and enable profiling.
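As a concrete illustration of the public-key point above, the sketch below shows how an issuer can sign an attestation that anyone holding the public key can verify offline. It is a minimal sketch only, assuming the Python `cryptography` package is available; the key names and the attestation string are illustrative, not part of any real DPI.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer generates a key pair: the public key is published,
# the private key never leaves the issuer.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

# The issuer signs an attestation (illustrative content only).
attestation = b"subject=alice;claim=over_18"
signature = issuer_key.sign(attestation)

# Any verifier holding the public key can check authenticity and integrity
# offline, without contacting the issuer or learning anything beyond the attestation.
try:
    issuer_public.verify(signature, attestation)
    print("attestation is authentic")
except InvalidSignature:
    print("attestation was tampered with")
```

Because verification needs only the public key, signed attestations can be checked without central lookups – the private key stays with the issuer.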
🎱 Privacy Principles in Practice
- User Control and Consent: People must know who is asking for their data (symmetrical identification), be able to disclose only what is necessary (selective disclosure), and be able to see and contest their data history (privacy dashboard). A selective-disclosure sketch follows this list.
- Unlinkability, Unobservability & Zero-Knowledge Proofs: Technical safeguards prevent tracking across services and observation of usage, and allow proving facts (e.g., being over 18) without revealing identity; the second sketch after this list shows the core zero-knowledge idea.
- Privacy-by-Design: Build systems from the ground up to minimize data collection for a clearly defined purpose, do not repurpose data without user consent, decentralize storage, and embed security by default. Privacy-by-design is not just a set of technical preferences; it is an ethical imperative.
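To make selective disclosure concrete, here is a minimal sketch of a salted-hash credential in the spirit of SD-JWT-style disclosure: the issuer signs only hashes of the attributes, and the holder reveals just the attribute (plus its salt) that a verifier actually needs. The attribute names, values, and the elided signing step are illustrative assumptions, not a production format.

```python
import hashlib
import json
import secrets

def commit(attribute: str, value: str) -> tuple[str, str]:
    """Salted hash of one attribute; the random salt keeps the value unguessable from the hash."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256(f"{salt}|{attribute}|{value}".encode()).hexdigest()
    return salt, digest

# Issuer: commit to every attribute and sign only the list of digests
# (the signature itself is elided here; see the signing sketch above).
attributes = {"name": "Alice", "date_of_birth": "1990-04-01", "nationality": "PH"}
salts, digests = {}, {}
for attr, value in attributes.items():
    salts[attr], digests[attr] = commit(attr, value)
signed_credential = json.dumps(digests, sort_keys=True)  # the issuer signs this string

# Holder: disclose only nationality; name and date of birth stay hidden.
disclosure = {"attribute": "nationality", "value": "PH", "salt": salts["nationality"]}

# Verifier: recompute the hash and compare it with the signed digest for that attribute.
recomputed = hashlib.sha256(
    f"{disclosure['salt']}|{disclosure['attribute']}|{disclosure['value']}".encode()
).hexdigest()
assert recomputed == json.loads(signed_credential)[disclosure["attribute"]]
print("nationality verified without revealing any other attribute")
```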
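The second sketch illustrates the core zero-knowledge idea with a textbook Schnorr proof of knowledge: the prover convinces the verifier that it knows the secret x behind a public value y without ever revealing x. The toy group parameters are for illustration only; real deployments use groups with an order of at least 256 bits, and age proofs build on richer credential schemes layered on the same principle.

```python
import secrets

# Toy group parameters for illustration only: p = 2q + 1 with q prime,
# and g generating the subgroup of prime order q.
p, q, g = 23, 11, 4

# Prover's secret x (e.g. a credential key); only y = g^x mod p is public.
x = secrets.randbelow(q - 1) + 1
y = pow(g, x, p)

# Round 1 (prover): commit to a fresh random nonce r.
r = secrets.randbelow(q - 1) + 1
t = pow(g, r, p)

# Round 2 (verifier): send a random challenge c.
c = secrets.randbelow(q)

# Round 3 (prover): respond using x, without ever sending x.
s = (r + c * x) % q

# Verification: g^s == t * y^c (mod p) holds exactly when the response was
# built from the secret behind y, so the verifier learns that the prover
# knows x but learns nothing about x itself.
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted: knowledge of the secret shown, secret never revealed")
```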
🚨 Risks to Watch Out For
- Data misuse & function creep (e.g., Aadhaar used for surveillance)
- Data breaches (e.g., 55M voter records leaked in the Philippines)
- Tracking & profiling (e.g., China’s social credit ecosystem)
- Over-identification & oversharing (e.g., mass ID uploads for UK online services)
- Identity theft & fraud (e.g., leaked NINs in Nigeria used in scams)