
More than Meets the Eye? The Questions Surrounding Worldcoin

Barely a week after its official launch, Worldcoin, the “proof of personhood” crypto project led by OpenAI CEO Sam Altman, is generating almost as much skepticism as support. With the goal of combating the spread of AI-driven bots online, the project entails scanning people’s irises in exchange for a digital ID and, in some countries, 25 free WLD tokens. Put another way, it is working toward a future digital economy in which every participant has proven their “humanness” and there are no bots.

Last week, Sam Altman tweeted an unverified claim that the project was onboarding an average of “eight users per second.” Whether or not that figure holds up, Worldcoin is clearly enjoying the limelight. The project has already deployed “orb” devices worldwide to scan people’s irises and issue them a blockchain-based digital identity. It has also prompted privacy concerns, however, as it remains unclear how data as sensitive as an iris scan might ultimately be used.

Ethics, privacy and the possibility of people selling their data unwillingly

Many governments seem to have been caught flat-footed by the hype around the coin. The project’s growing popularity is expected to be sustained by the promise of free WLD tokens to anyone willing to have their irises scanned. European data protection authorities have already swung into action, with France’s National Commission on Informatics and Liberty (CNIL) and the UK’s Information Commissioner’s Office saying they would examine Worldcoin’s data practices.

There is also the question of ethics. A number of critics have called into question the ethics of selling one’s private data. Tools for Humanity, the company behind Worldcoin, insists, however, that it does not keep users’ biometric data. Instead, it claims the data is transformed into what it calls an “IrisCode,” a string of letters and numbers that cannot be decrypted to reconstruct a person’s iris.

Nonetheless, even with these assurances about the privacy of iris scans, there are justifiable reasons for concern. Last week, Sam Altman stated, “We think that we need to start experimenting with things so we can figure out what to do,” leaving much to speculation.

Despite the flood of criticism, tens of thousands of people have rushed to get their irises scanned. In Nairobi, Kenya, for example, a local supermarket chain has set up orb scanning desks, and hundreds of mostly young people have been waiting in line since last week to have their irises scanned for WLD tokens. Amid a worsening global economic crisis, driven by inflation and high unemployment, there is a real possibility that many people are “selling” their iris scans unwillingly, if only to secure a week’s meals.