
Boston City Council votes to ban facial-surveillance technology

The City Council today voted unanimously to bar police and other city agencies from using facial-surveillance software except for specific criminal investigations - and even then only if the data is not generated by city-owned cameras.

The proposal, which would also prohibit the city from buying data from companies that might use the technology, now goes to the mayor for his consideration. It exempts systems, such as those on phones, that use facial recognition solely for user authentication.

BPD says it does not currently use facial-surveillance technology because it is unreliable. A planned upgrade to the camera network BPD has in place around the city would include a facial-recognition module, but BPD has said it would ensure that is turned off at all times.

City Councilor Ricardo Arroyo (Hyde Park, Roslindale, Mattapan), who sponsored the proposal, says the measure would ensure Bostonians are not subject to misidentification or systemic racism - for example, the fact that some camera systems are not designed to deal well with darker-skinned faces.

In response to a question from Councilor Frank Baker (Dorchester), Arroyo said the proposal would not prohibit police from collecting photos from existing camera systems to find specific suspects, such as after the Marathon bombings.

"Let's not live in a society where we are constantly surveilling each other's faces," added Councilor Kenzie Bok (Back Bay, Beacon Hill, Fenway, Mission Hill).

Attachment: Proposed city ordinance (PDF, 315.17 KB)



Comments

I don't care if you're an anarchist, conservative, libertarian, socialist or moderate -- this is a WIN for all against unwarranted government surveillance!


How about a city agency independent from the police makes sure this facial-recognition module isn't installed at all?

What does it mean that police can collect photos from existing camera systems? Does that mean a human has to do the comparisons to match the camera images with photos linked to people's names?

Of course we're all wearing masks now, so none of it will work anyway.


See this story about how too many facial recognition systems are bad at identifying anyone but white males, which can lead to horrific situations.


How horrific is it, really? It's not like a T-1000 is out terminating people based on facial-recognition software matches.


You're right, false arrest isn't a big deal. /s


How horrific is it, really?

Did you even read the article? The guy got arrested and thrown in jail for 30 hours, even after the police recognized that the facial recognition software had returned a false positive.

Would you like to give that a whirl so you can rank its horrificness? For some people, getting arrested could be a death sentence. Or it could cause them to lose their job. Are you up for that? Is that acceptable as "collateral damage"?


For some people, getting arrested could be a death sentence.

Please elaborate on how getting arrested is a death sentence.


That's not a "death sentence". It's a misuse of language. Describing something as a death sentence suggests that you know there is a super high likelihood of death.


Are you unfamiliar with the English language? I used the word "could" to indicate that there was an unknown, but greater than zero, likelihood of death.


Just give up. There exists a non-zero chance I will get bombed by ISIS when I go to the supermarket this weekend. That doesn't make my planned trip to the supermarket a "death sentence," in our vernacular. Inserting the word "could" doesn't fix it, either.

Either way, trying to prove a point by citing outlier cases doesn't help anybody. It only confuses things. There are valid concerns over privacy and mass surveillance with facial recognition. There are also valid concerns over excessive use of force by police. Those issues are separate.


Whatever you say.


You don't have to live like a refugee.


"A planned upgrade to the camera network BPD has in place around the city would include a facial-recognition module, but BPD has said it would ensure that is turned off at all times."

I'd prefer not to have to take BPD at its word. Let's hope this ban reshapes the planned upgrade rollout.


"A planned upgrade to the camera network BPD has in place around the city would include a facial-recognition module, but BPD has said it would ensure that is turned off at all times."

Except when it isn't.

Did they not get the memo that no one trusts the police?


Now defund cops and the Licensing Board while you're on this progressive kick.


This story is conveniently not being reported: Mob attacks Boston cops trying to stop a felon with a gun.
https://boston.cbslocal.com/2020/06/24/boston-police-arrest-hostile-crow...


I'm seeing this one a lot lately.

"Why is this article about this topic and not about some other topic?!?!?!"

or

"Why is no one talking about X in this article about Y?!?!?!"

Because "liberal media bias" or something.


And thus we see what happens when people's trust in these alleged "protectors" has dropped to zero.


Great. But why stop there? What about all the other evidence-evaluation techniques that are rife with error?


Facial recognition should also have no role in civil matters, such as those involving sexual harassment.


"BPD says it does not currently use facial-surveillance technology because it is unreliable. A planned upgrade to the camera network BPD has in place around the city would include a facial-recognition module, but BPD has said it would ensure that is turned off at all times."

Prediction: within five years a scandal will break, revealing that:

  • The police left this turned on the whole time;
  • They were regularly using it behind the scenes;
  • They were engaging in "parallel construction" so they could hide it from the courts;
  • Numerous cases were suddenly dropped without explanation when they couldn't make a strong enough fake case without tipping their hand to the courts; and finally
  • This all got leaked by accident or by a whistleblower.

Mark my words, this will happen.
