After Apple’s latest product announcement earlier this year, concerns have been raised, and are spreading online, about what it means for the privacy of its users. Privacy around new technologies, and the control we may be handing over to tech giants, has been a hot topic recently, and Apple’s new feature is decidedly not dousing the fire.
Apple says the new feature, which will roll out on certain devices in the US, will automatically scan images on personal devices in a bid to tackle child sexual abuse material (CSAM).
NeuralHash’s perceptual hashing function creates fingerprints in a different way to traditional cryptographic hashing: visually similar images produce similar hashes, so imagery can be matched against a database of known material without being decrypted or viewed. Only once a threshold number of matches is met, and a sequence of further checks completed, is a report made to the National Center for Missing and Exploited Children.
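NeuralHash itself is proprietary, but the general idea can be sketched with a much simpler perceptual hash. The code below is purely illustrative, not Apple's implementation: it uses an "average hash" (aHash) over an 8x8 grayscale grid, compares hashes by Hamming distance, and flags only once a match threshold is reached (a threshold of around 30 matches was reportedly part of Apple's design; the number here is an assumption for illustration).

```python
# Illustrative sketch only: NeuralHash is proprietary. This shows the general
# idea of perceptual hashing with a simple "average hash" (aHash). Visually
# similar images yield similar bit-strings, unlike cryptographic hashes,
# where changing one pixel flips the entire digest.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int
    with one bit per pixel: 1 if the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")

def matches_database(image_hash, known_hashes, max_distance=5):
    """A hash 'matches' if it is within max_distance bits of any known hash.
    max_distance is an illustrative tolerance, not a real parameter."""
    return any(hamming_distance(image_hash, k) <= max_distance
               for k in known_hashes)

def should_flag(match_count, threshold=30):
    """Only after a threshold number of matches would any report follow;
    the value 30 is the figure reported for Apple's design, used here
    purely as an example."""
    return match_count >= threshold
```

The key property shown here is tolerance: a slightly altered image (recompressed, resized, lightly edited) still lands within a few bits of the original's hash, while an unrelated image lands far away, which is why a single match is meaningless and a threshold plus human review sits between matching and reporting.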
The new feature has prompted notable tech leaders to raise concerns about what this means for privacy and consumer rights, led by WhatsApp head Will Cathcart, who described it as a “step back for people’s privacy all over the world.” Apple, for its part, has said the feature is designed with user privacy in mind, in line with the company’s public commitment to privacy for all users.
The EU and UK governments are taking steps to regulate how the powerful US tech giants store and use consumer data. The EU opened an antitrust investigation into Apple following a 2019 complaint from Spotify. In the UK, the Digital Markets Unit, part of the Competition and Markets Authority, may be able to suspend, block and reverse decisions by tech giants and issue fines of up to 10% of turnover.
Yet at the same time, Apple has been under government pressure in the US to increase surveillance of encrypted data. Governments and agencies are pressuring all large organisations that offer end-to-end or even partial encryption, citing CSAM and possible terrorist activity as the rationale for such measures. It’s a confusing message.
Who is really leading the charge here? Whether it is tech giants or governments, it is concerning. Matthew Green, a cryptography researcher at Johns Hopkins, warned that the system could be weaponised or used to frame innocent people, and said it will lead to “bulk surveillance of phones and laptops.” Not all governments and organisations in the world have the wellbeing of their citizens at heart.
In 2020, there were reports of organisations using phone location data to track and spy on Black Lives Matter protesters, and in Jordan, the World Food Programme collected migrants’ biometric data, requiring refugees to submit it to have access to food in camps.
Privacy campaigners have said that there are dangerous ramifications, and question whether this is a backwards step in digital security. WhatsApp came under fire for less robust privacy protections when its updated privacy policy was rolled out back in May, and now Apple is being criticised for infringing privacy.
But where does responsibility for privacy lie?
Smart devices and technology are embedded in our everyday lives, and most consumers trust them because it’s easier to do so. Few of us read the privacy notices. We accept that Alexa, and thus Amazon, knows when we’re home, when we leave the house and when we go to bed, because it makes life convenient. Each of these devices or companies in isolation might be fine, but what if Amazon starts telling authorities when you’re home based on your Alexa activity?
We can’t just blame the tech behemoths – we all need to take responsibility to make a shift towards a less invasive approach. Consumers have to be confident that they are using a platform that genuinely has privacy at its core. Whether that’s through reading the policies or only selecting the data you wish to share, or choosing platforms that allow you to store encrypted data in select locations, it is essential to take responsibility. As more lawsuits and regulations come to the fore, we may find a more cautious approach being taken.
At the same time, governments must do more than conduct public consultations into antitrust, or bring in privacy-flavoured measures only at times of crisis or when they are worried about competition and market share (which may be the real issue here). There should be more uniform regulation where possible, with collaboration between countries, something made all the more important by growing population migration and multi-residency. Data sovereignty currently means something different in each country, which makes reining in tech sovereignty harder. A consistent approach and regulated data flows need to be made a priority.
The speed at which technology is adapting and improving our everyday lives is phenomenal – and alarming. Regulation and investigation need to catch up, and tech giants need to build their products with the fundamental human right of privacy in mind. Unfortunately, for the time being the drive for privacy seems to be motivated more by avoiding fines and reputational damage than by principle.
However, there is a growing movement and demand from consumers for transparency, and the tech giants that fail to deliver on this will fall behind. If organisations don’t change their attitudes towards this highly sensitive topic soon, their users will force them to.