This summer, OpenAI, the creator of ChatGPT, launched a new crypto project, 'WorldCoin,' which offers participants a share of a crypto token in exchange for their biometric data (an iris scan). WorldCoin's mission is said to include developing a 'reliable solution for distinguishing humans from AI online' and creating a 'potential path to AI-funded UBI [universal basic income].'
More than 2 million participants have signed up to WorldCoin since its launch. Accordingly, OpenAI may be processing a huge quantity of biometric data (if the iris scans are used to identify individuals), which is sensitive ('special category') personal data under the GDPR and UK GDPR. Concerns have already been raised as to whether WorldCoin is harvesting more information from its participants than that to which those participants have consented. WorldCoin claims to host users' data on a decentralised blockchain. Data subjects have the right to withdraw consent to the processing of their data at any time, and to seek its deletion, but it is not clear whether WorldCoin's blockchain storage approach is compatible with those rights.
On 31 July 2023, the UK Information Commissioner's Office (ICO) released a statement that included the following:
1. 'Organisations must conduct a Data Protection Impact Assessment (DPIA) before starting any processing that is likely to result in high risk, such as processing special category biometric data.'
2. 'Organisations also need to have a clear lawful basis to process personal data. Where they are relying on consent, this needs to be freely given and capable of being withdrawn without detriment.'
3. 'We note the launch of WorldCoin in the UK and will be making enquiries.'
Other European data protection regulators have also commented, with the French regulator, the CNIL, describing WorldCoin's data collection as 'questionable.'
Originally published 21 August 2023
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.