Facial recognition should never be used as a public security tool. What the technology excels at is allowing authorities to track not just individuals but groups and their movements. Facial recognition has been banned for government use in San Francisco and Oakland, California, and in Somerville, Massachusetts. And now, with the No Biometric Barriers to Housing Act, it may be banned in any housing project in the US that receives government funding.
While the federal government can ban the software for public housing, where it controls the purse strings, facial recognition will still be a problem for tenants outside the public housing system. There are no restrictions preventing landlords from using the software, and certainly many of them see it as a useful security tool. It goes along with the suite of smart home tech that many landlords install: smart locks, surveillance cameras, and now facial recognition. In one Brooklyn building, tenants had to take their landlord to court just to be issued proper keys.
While individual homeowners are free to install whatever madness they’d like in their own homes, it’s not reasonable for landlords of rental properties to do the same, especially in places where there is a housing shortage. Tenants who need housing will increasingly end up with homes where they are being monitored, and the more comfortable they get with that reality, the more monitored they will be. Giving up freedoms for convenience gets easier and easier until suddenly there are none left to be had.
It’s bad enough that facial recognition tech has been deployed by border security, transportation security, and some police departments, New Orleans and New York among them. But a life lived under constant surveillance limits liberty and personal agency. This is not something only bad actors should worry about; it concerns all of us. To live freely means living free of any authority, official or merely curious, knowing what you’re doing at all times, and free of a landlord tracking you through your key fob. The information that’s gathered doesn’t go away.
The information is also often wrong. Watchlist matching produces a high rate of false positives: when everyone who passes a camera is compared against a database of known criminals, even a small error rate flags far more innocent people than genuine matches. For the software to really be effective, the entire system of ID photos would have to be integrated, from licenses and school IDs to passports and workplace security clearances. That would demand a staggering degree of private industry and government collaboration, and it would be a huge violation of civil liberties and individual rights.
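The false-positive problem is a matter of simple arithmetic. The numbers below are purely illustrative assumptions, not figures from any real deployment, but they show why even a seemingly accurate matcher floods operators with false alarms:

```python
# Base-rate sketch: why watchlist matching produces mostly false alarms.
# All numbers here are illustrative assumptions, not measurements of any real system.

def alarm_breakdown(population, watchlist_hits, true_positive_rate, false_positive_rate):
    """Return (true alarms, false alarms) for one pass over the population."""
    innocent = population - watchlist_hits
    true_alarms = watchlist_hits * true_positive_rate
    false_alarms = innocent * false_positive_rate
    return true_alarms, false_alarms

# A transit system scanning 100,000 faces a day, looking for 10 people,
# with a seemingly impressive 99% accuracy in both directions.
true_alarms, false_alarms = alarm_breakdown(100_000, 10, 0.99, 0.01)
print(f"true alarms: {true_alarms:.0f}, false alarms: {false_alarms:.0f}")
# Roughly 10 true alarms against about 1,000 false ones: nearly everyone
# flagged is innocent, even though the software is "99% accurate".
```

Under these assumptions, about ninety-nine out of every hundred people stopped would be stopped by mistake, which is the base-rate problem in a nutshell: when the people you are looking for are rare, false alarms dominate.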
The software is now used to detect illegal activity, but would it be any surprise to find it used to monitor immoral activity as well, and to shame people for their behaviour? Or used by insurance companies to monitor your grocery purchases to see if you’re really keeping to that diet? By employers to track what you’re doing on your lunch break? There’s no end to the behaviours people can use to judge one another, and surveillance tech leaves the door open to all of it.
The implications of the software for individuals’ personal lives are one thing, but it goes deeper. It is currently being used in China’s Xinjiang region to track the Uyghur minority. Beijing is also getting on board with surveillance in public housing, exactly the thing that this bill would prevent. Constant surveillance does more than anything else to cow people into submitting to state wishes, because it creates a fear, a worry that whatever you do could be seen. So if there’s something you’re doing that the state doesn’t want you to do, like gathering with other believers for a worship service, you might decide not to risk it.
The tracking of groups is a huge civil rights concern. If, for example, and I’m just spitballing here, a group of women wanted to get together to talk about women’s rights and the necessity of women-only spaces, and meetings of this type were frowned upon, facial recognition would be a perfect tool for tracking and monitoring them. The system could be trained to look for large groups of women as they funnelled through a city transit system towards their predetermined location.
If this software had been available in, say, the Deep South during the Civil Rights Era, when African American men and women held secret meetings to organize against Jim Crow laws, segregation, and discrimination, these brave people might have felt unable to take part for fear of their own safety. And given that their safety was already in danger, imagine how much worse it would have been if authoritarian, segregationist whites could actually have tracked them. What a nightmare.
The bill to prevent facial recognition in public housing is necessary. Tenants who live in subsidized housing should not be subject to personal privacy violations. But neither should anyone else. The federal government may not have the power to ban facial recognition tech in private spaces. But anyone who rents housing to the public should not be allowed to violate tenants’ privacy through facial recognition, and wherever the software is in use, that use should be disclosed. People should have the right to know when they are being monitored in this invasive way. Will it be at hospital emergency rooms? Homeless shelters? Hotels? Banks? Movie theatres?
Individuals sign on to being tracked all the time, using their faces to open locks or phones. But we cannot allow it into our personal spaces, and especially not in our homes. The tech is moving faster than civil liberties laws, and it’s up to us to keep our freedoms in place.