Abhishek Tiwari
Diary of a Tech Savant and Servant Leader - All things technology, product, and engineering leadership.
Published

Privacy in data systems has traditionally focused on protecting sensitive information as it enters a system, or what we call input privacy. However, as systems become more complex and capable of inferring sensitive information from seemingly harmless data, output privacy has gained significant attention.

Published

Secure multi-party computation (SMPC) enables organisations to collaborate on sensitive data analysis without directly sharing raw information. However, seemingly harmless aggregate outputs, particularly private set intersection (PSI), can leak individual-level information when analysed strategically over time.
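
To make that leakage concrete, below is a minimal Python sketch of the kind of differencing attack described here; it is purely illustrative, with hypothetical identifiers and an ordinary set intersection standing in for a real PSI-cardinality protocol.

```python
# Toy simulation (not real SMPC): repeated PSI-cardinality queries can
# leak individual membership through a simple differencing attack.
# All identifiers below are hypothetical.

partner_set = {"alice@example.com", "bob@example.com", "carol@example.com"}

def psi_cardinality(my_set, their_set):
    """Stand-in for a secure PSI-cardinality protocol: only the size of
    the intersection is revealed, never the elements themselves."""
    return len(my_set & their_set)

baseline = {"dave@example.com", "alice@example.com"}
target = "bob@example.com"  # the individual whose membership we want to infer

before = psi_cardinality(baseline, partner_set)
after = psi_cardinality(baseline | {target}, partner_set)

# If the count grows after adding exactly one record, that record
# must be present in the partner's set.
print("target is in partner set:", after > before)  # True
```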

Published

Multi-touch attribution is considered the holy grail of the advertising industry. As advertisers target users with multiple advertisements across different platforms and publishers, understanding how each of these touch points contributes to a conversion is crucial, but this understanding has traditionally come at the cost of user privacy.

Published

Safeguarding individual privacy inherently means data minimisation, i.e. limiting the collection of data and ensuring its timely disposal. This principle has been a cornerstone of privacy advocacy and is even enshrined in regulations like the EU's General Data Protection Regulation (GDPR). However, research published by Ponte et al. challenges this fundamental assumption, introducing what they call the "Where's Waldo" effect.

Published

Homomorphic encryption is a powerful cryptographic technique that allows computations to be performed on encrypted data without decrypting it first. This blog post will introduce the concept of homomorphic encryption and demonstrate implementations using Python.

What is Homomorphic Encryption?

Homomorphic encryption is a form of encryption that allows specific types of computations to be carried out on ciphertext.
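
As a taste of what such a Python implementation can look like, here is a minimal sketch of additively homomorphic (Paillier) encryption using the python-paillier (`phe`) package; the library choice is an assumption for illustration, not necessarily what the post uses.

```python
# pip install phe -- a minimal Paillier (additively homomorphic) sketch.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Encrypt two values independently.
enc_a = public_key.encrypt(15)
enc_b = public_key.encrypt(27)

# Add the ciphertexts and scale by a plaintext constant,
# all without ever decrypting the inputs.
enc_sum = enc_a + enc_b
enc_scaled = enc_sum * 2

print(private_key.decrypt(enc_sum))     # 42
print(private_key.decrypt(enc_scaled))  # 84
```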

Published

Tech companies and large consumer businesses are grappling with how best to protect end-user data while maintaining their pace of innovation and competitive edge. Two distinct approaches have emerged: top-down and bottom-up privacy. Understanding these approaches is essential for anyone involved in privacy engineering, product development, or driving tech policy decisions.

Published

In the last post, we covered Privacy Preserving Measurement (PPM) and discussed how the Distributed Aggregation Protocol (DAP) works. Today, we'll explore how to implement a simplified version of DAP in Python, with Prio3 as our Verifiable Distributed Aggregation Function (VDAF). This implementation will support multiple clients, demonstrating how DAP can aggregate data from multiple sources while maintaining privacy.
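
For a flavour of that data flow, here is a heavily simplified Python sketch using plain additive secret sharing between two aggregators; it omits the validity proofs a real Prio3 VDAF provides, and the field size, variable names, and measurements are illustrative assumptions.

```python
# Stripped-down DAP-style aggregation: each client splits its measurement
# into two additive shares, one per aggregator, so neither aggregator
# ever sees an individual value.
import secrets

MODULUS = 2**61 - 1  # illustrative prime field size

def split_into_shares(measurement: int) -> tuple[int, int]:
    """Client: split a measurement into two additive shares mod MODULUS."""
    share_1 = secrets.randbelow(MODULUS)
    share_2 = (measurement - share_1) % MODULUS
    return share_1, share_2

# Each client reports a 0/1 measurement (e.g. "did the error occur?").
client_measurements = [1, 0, 1, 1, 0]

leader_shares, helper_shares = [], []
for m in client_measurements:
    s1, s2 = split_into_shares(m)
    leader_shares.append(s1)
    helper_shares.append(s2)

# Each aggregator sums only its own shares locally.
leader_aggregate = sum(leader_shares) % MODULUS
helper_aggregate = sum(helper_shares) % MODULUS

# The collector combines the two aggregate shares to recover the total.
total = (leader_aggregate + helper_aggregate) % MODULUS
print(total)  # 3
```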

Published

In 1982, Andrew Yao proposed the Millionaires' Problem, which asks how two millionaires can learn who is the richer one without disclosing their actual wealth. They solve this problem by comparing their wealth using secure two-party computation, ensuring that they learn only who is richer and that nothing else is revealed. The problem was later generalised to secure multiparty computation by Goldreich et al. in 1987.
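
To make "nothing else is revealed" precise, the tiny Python sketch below models only the ideal functionality of the problem; it is not a secure protocol, and a real solution replaces this trusted function with secure two-party computation (for example, garbled circuits).

```python
# Ideal functionality only: a notional trusted party sees both inputs and
# outputs a single bit of information -- who is richer -- and nothing else.
# Secure two-party computation achieves the same outcome without any
# trusted party ever seeing the inputs.

def millionaires_ideal(alice_wealth: int, bob_wealth: int) -> str:
    return "Alice" if alice_wealth > bob_wealth else "Bob"

# Hypothetical wealth figures.
print(millionaires_ideal(12_000_000, 9_500_000))  # Alice
```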

Published

The promise of differential privacy is compelling. It offers a rigorous, provable guarantee of individual privacy, even in the face of arbitrary background knowledge. Rather than relying on anonymization techniques that can often be defeated, differential privacy works by injecting carefully calibrated noise into computations.
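
For reference, that guarantee has a standard formal statement (included here for context, not quoted from the post): a randomised mechanism \mathcal{M} satisfies \varepsilon-differential privacy if, for every pair of datasets D and D' differing in a single individual's record and every set of outputs S,

\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S]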

Published

Differential Privacy (DP) is a mathematical framework that protects individual privacy in data analysis while allowing useful insights to be extracted. It works by adding carefully calibrated noise to data or query results, ensuring that including or excluding any single individual's data doesn't significantly change the analysis outcomes.
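
A minimal Python sketch of one standard way to add that noise, the Laplace mechanism, is below; the query, sensitivity, and epsilon values are hypothetical.

```python
# Laplace mechanism: noise scaled to sensitivity / epsilon keeps the
# output distribution nearly unchanged by any one individual's data.
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a differentially private estimate of a numeric query result."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Hypothetical counting query (sensitivity 1) over a dataset of 1,234 rows.
true_count = 1234
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(round(private_count))
```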