Holy #surveillance hell, Batman.

Let me get this straight:

First, they feed your video, which is already stored in their cloud, into an #AI transformer to write descriptions.

Then they feed those descriptions into a pattern-learning system (ML, presumably) to figure out your habits and routines.

All of this is stored in the cloud. So they not only have your video, but a narrative about your habits, ready to be exfiltrated, monetized, and shared with law enforcement.

#ai #enshittification #RingCamera

https://www.theregister.com/2025/06/25/amazons_ring_ai_video_description/

🔥 Just dropped: a fresh #EDRigram in your inbox!

#Surveillance is rising, fascism is spreading, democracy is gasping. How do we resist? We’ve got plans. Big ones:

⛰️ EDRi’s strategy for 2025–2030 is locked and loaded
👁️‍🗨️ Our position paper calling for a BAN on commercial #spyware
📌 #PrivacyCamp is back! September 30: save the date for the 2025 edition and get hyped

Read about this (and much more) in our latest edition. Dive in ⤵️
https://edri.org/our-work/edri-gram-25-june-2025/

"At their heart, these technologies infringe human rights."

Last week @sianberry tabled an amendment to the UK Crime and Policing Bill that would prohibit the use and deployment of dangerous 'crime-predicting' police tech.

These systems will subject overpoliced communities to more surveillance. More discrimination. More injustice.

Sign the petition to BAN it ➡️ https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#SafetyNotSurveillance #surveillance #precrime #predictivepolicing #police #policing #ukpolitics #ukpol

Video of Siân Berry (Green MP) speaking in the UK House of Commons about an amendment to the Crime and Policing Bill that "would prohibit the deployment and use of certain forms of “predictive” policing technologies, particularly those that rely on automated decision-making, profiling and artificial intelligence, to assess the likelihood that individuals or groups will commit criminal offences." "Such technologies, however cleverly sold, will always need to be built on existing, flawed police data, or data from other flawed and biased public and private sources. That means that communities that have historically been over-policed will be more likely to be identified as being “at risk” of future criminal behaviour.... At their heart they infringe human rights, including the right to privacy and the right to be presumed innocent."