A growing problem for India
Facial recognition technologies can make mistakes
It’s not just in Hyderabad. If you live in any of India’s big cities and commute to work every day, there’s a high chance that you’re being watched. In July 2021, for example, surveillance systems were installed in more than 30 railway stations in Gujarat and Maharashtra. Over seven million people travel on Mumbai’s suburban train network every day, and now all of them are caught on camera.
In the past five years, the government has funded at least forty video-surveillance projects and issued an open tender for a centralised nationwide system, the National Automated Facial Recognition System.
“What’s clear from all these projects is that this government is putting surveillance first,” says Divij Joshi, a lawyer and researcher who founded the AI Observatory, an independent research organisation. “Often security becomes the go-to excuse for these things, but it is so weak and widely defined, with so few safeguards, that it could easily turn into moral policing of interfaith couples or activists, or whomever they want.”
And there’s a further problem: the technology is far from perfect and often misidentifies faces. That means you could face arrest on the strength of “evidence” caught on camera when the real suspect is simply someone who shares a few of your facial features.
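To see why such false matches happen, consider how these systems typically work: a camera image is converted into a numerical “embedding” and compared against a watchlist using a similarity score and a fixed threshold. The sketch below is purely illustrative; the embeddings, names and threshold are made up for this example and do not describe any particular deployed system. It shows how a person whose features merely resemble a suspect’s can cross the threshold and be flagged.

```python
# Illustrative sketch of threshold-based face matching.
# The vectors below are invented; real systems use embeddings with
# hundreds of dimensions produced by a neural network.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

suspect  = np.array([0.90, 0.10, 0.40, 0.30])  # face on the watchlist
commuter = np.array([0.88, 0.12, 0.42, 0.28])  # different person, similar features
stranger = np.array([0.10, 0.95, 0.05, 0.60])  # clearly different face

THRESHOLD = 0.95  # a system flags a "match" above some similarity cut-off

for name, face in [("commuter", commuter), ("stranger", stranger)]:
    score = cosine_similarity(suspect, face)
    print(f"{name}: similarity={score:.3f} flagged={score >= THRESHOLD}")

# Output: the commuter scores ~0.999 and is flagged as the suspect even
# though they are a different person (a false positive); the stranger
# scores ~0.33 and is not flagged.
```

The same mechanism scales up in real deployments: the larger the watchlist and the more faces scanned each day, the more of these near-miss false positives a fixed threshold will produce.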
It’s time to demand full transparency from the government. If surveillance projects are to continue, they need an open public conversation and legislation ensuring that data is collected and used only when strictly necessary, in accordance with international standards.
Privacy is a basic human right, and we must defend it before it’s too late.