Consumers and corporations are driven to engage in a digital world they cannot adequately trust. We are developing paradigms to enable online commerce and machine learning in ways that provide privacy and protect user identities, through such concepts as local differential privacy, federated machine learning, identity brokering, and blockchain technology.
Privacy, Identity and Trust
Personal Data Control
Many facets of modern life underscore the need for standards on personal data protection. For example, advertising networks track user behavior and interests using sophisticated and stealthy techniques. The practical impact of, and compliance with, GDPR and stated privacy policies remain poorly understood. Data breaches, data-sharing relationships, and big-data analytics affect privacy in unprecedented ways.
Privacy on the internet is a right. Users should have the ability to exercise control over the data they share in order to balance privacy and convenience. Formal privacy policies should be verifiable. Ideally, the potential inferences and consequences of sharing particular pieces of data would be identified and communicated to the user. We aim to develop new identity protocols that aid in protecting personal data.
Safe Identity, Authorization, and Verifiable Claims
We need a new approach to establishing and verifying identity in the digital world. Today, people rely on separate usernames and passwords to authenticate with each organization, and those organizations inevitably store a variety of information about their users. This data often includes globally linkable attributes (name, date of birth, etc.), exposing a large attack surface and putting users at risk of serious privacy loss. In an ideal world, users could create unforgeable yet unlinkable digital identities for authentication with organizations. Claims about, and authorizations for, a user could then be issued, and the user could associate verifiable information about these claims and authorizations with identities they have created. We aim to develop new identity protocols that meet the goals of security, privacy, verifiability, and usability for both people and organizations.
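To make the issue-then-verify flow concrete, here is a deliberately minimal sketch in Python. It uses a symmetric HMAC as a stand-in for the issuer's digital signature, so the verifier must share the issuer key; real schemes would use public-key or anonymous-credential signatures to avoid that. The pseudonym field, key, and claim shape are all hypothetical illustrations, not part of any proposed protocol.

```python
import hmac
import hashlib
import json

# Hypothetical issuer signing key; a real issuer would use an
# asymmetric signature scheme, not a shared symmetric secret.
ISSUER_KEY = b"issuer-secret-key"

def issue_claim(subject_pseudonym: str, claim: dict) -> dict:
    """Issuer binds a claim to a user-chosen pseudonym rather than a
    global identifier, so presentations under different pseudonyms are
    not trivially linkable across organizations."""
    payload = json.dumps({"sub": subject_pseudonym, "claim": claim},
                         sort_keys=True)
    tag = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_claim(credential: dict) -> bool:
    """Verifier checks the issuer's tag over the presented payload
    without needing any other identity attributes about the user."""
    expected = hmac.new(ISSUER_KEY, credential["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["tag"])
```

For example, an issuer could attest `{"over_18": True}` for a pseudonym, and any verifier holding the key accepts the credential unchanged but rejects a tampered payload.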
Social Trust and Reputation
A degree of trust is needed when we engage in business transactions, especially peer-to-peer transactions in the sharing economy (e.g., Airbnb, Uber, Blablacar, eBay). A web of trust (WoT) is one approach to establishing trust in a large community, based on social and transactional graphs and the idea that trust is, to some degree, transitive. We aim to replace today's siloed, app-specific user reputation systems with a global WoT that establishes a stronger notion of trust usable across applications. We will address scalability, robustness, subjectivity, and privacy, as well as new use cases such as allowing users to rely on their most trusted connections (explicitly or implicitly determined) to verify their identity or wishes in contexts like password recovery and digital legacy.
Privacy-Preserving Analysis of Security Data
We need to honor our customers’ privacy while still collecting the data that drives worthwhile security analytics and machine learning. Today, some organizations refuse to submit telemetry they consider sensitive. Traditional access and security controls provide a measure of protection against misuse or theft of data, but emerging techniques offer stronger guarantees. Principled systems architecture (applying anonymization, thresholding, encoding, shuffling, trusted hardware, etc.) helps ensure that a breach of a centralized telemetry store does not leak sensitive information. Local differential privacy (LDP), which moves the trust boundary to the data owner by perturbing data before submission, could let customers confidently contribute their data to improve security analytics. Furthermore, emerging federated machine learning paradigms may allow training models that benefit from global data yet remain customized to their particular context, all while limiting the leakage of sensitive data.
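The classic building block behind LDP is randomized response: each data owner flips their own bit with a calibrated probability before submission, and the aggregator debiases the noisy reports to recover population statistics without learning any individual's true value. The sketch below illustrates the mechanism for a single binary attribute; the function names and the binary setting are illustrative simplifications of deployed LDP protocols.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    """Perturb one binary value locally, before it leaves the data owner.

    Reports the true bit with probability p = e^eps / (e^eps + 1) and
    its flip otherwise, satisfying eps-local differential privacy.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p else 1 - true_bit

def estimate_frequency(reports, epsilon: float) -> float:
    """Aggregator side: debias the mean of perturbed reports to
    estimate the true fraction of 1-bits in the population."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

Smaller epsilon means more noise per report and stronger privacy, so accurate aggregate estimates require more contributors; that trade-off is exactly why LDP suits large-scale telemetry collection.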