Indiscriminate mass surveillance:
Apple explains why it scrapped its automatic photo scanning
von Leonhard Pitz
#Chatkontrolle #CSAM
"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."
https://www.macrumors.com/2023/09/01/apple-explains-csam-plan-abandoned
@oblomov *gets excited* this is also where to start building the #moderation tools, #blocklist management support, #CSAM reporting, etc
Operation Narsil: Interpol dismantles a huge network of child abuse material
The international police organization #Interpol has completed a two-year global operation, codenamed #Narsil, to shut down a network of websites profiting from the distribution of child sexual abuse material (#CSAM).
Share this post if you found the news interesting.
#internazionale #interpol #operazione #globale #narsil #rete #web #CSAM #redhotcyber #online #it #ai #hacking #privacy #cybersecurity #cybercrime #intelligence #intelligenzaartificiale #informationsecurity #ethicalhacking #dataprotection #CyberSecurityAwareness #cybersecuritytraining #CyberSecurityNews #infosecurity
I shall not name anyone but there was somebody shouting #Defederate
They have reasons.
There can be dangerous material on a server. An obvious example is Child Sexual Abuse Material (#CSAM).
At the heart of the #Fediverse is the idea that we each take responsibility for our actions.
And we all have tools to block and mute accounts and filter according to hashtags. Let's all make sure we can use them.
What tools exist in the #MastoAdmin for checking one's media for #CSAM? I don't want to go all the way down the rabbit hole of "I will burn everything good about the internet out of fear of this" but I would like to be a good fedizen and make sure our instance isn't hosting anything surprising and awful.
Bonus points if those tools can refer me back to the account from which the media came.
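As a purely hypothetical sketch of what such a tool could look like at its simplest: hash every file under the instance's media directory and flag exact matches against a list of known-bad hashes. All paths and names below are invented for illustration; real child-safety tooling (PhotoDNA and similar) uses perceptual hashes and vetted industry hash lists, whereas plain SHA-256 only catches byte-identical copies.

```python
# Toy admin-side media check: flag files whose SHA-256 appears in a
# known-bad hash list. Illustrative only - not any real MastoAdmin tool.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_media(media_root: Path, bad_hashes: set[str]) -> list[Path]:
    """Return every file under media_root whose hash is in bad_hashes."""
    return [p for p in media_root.rglob("*")
            if p.is_file() and sha256_of(p) in bad_hashes]
```

The "bonus points" step — tracing a flagged file back to the uploading account — would still require joining the file path against the instance's media-attachment database records, which this sketch does not cover.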
It's annoying seeing mastodon users getting defensive about The Stanford Report on Child Safety when what we should be doing is calling on Mastodon gGmbH to add the Directions for Future Improvement features to the roadmap.
I'm not an expert but the stuff on PhotoDNA and CyberTipline support sounds like a win to me.
https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
#mastodon #roadmap #fediverse #childsafety #CSAM #stanford #photodna
I read this study by the Stanford Internet Observatory, on #CSAM and #Mastodon
Do these researchers know that in the US, if you find such "explicit images", you are supposed to stop immediately and call the police?
You don't have time to question the owner of a server; otherwise, you risk being treated as an accomplice under the law.
How terrible, this #Fediverse (which is presumably what "on Mastodon" is meant to say). It is not only full of #Mastodons, but also full of filth. It must immediately be placed under detailed surveillance and, if necessary, shut down.
https://www.heise.de/news/Darstellungen-von-Kindesmissbrauch-bei-Mastodon-gefunden-9225873.html
#fediverse #mastodons #CSAM #zensur #uberwachung
I agree with what y'all said about #CSAM, because it really is a problem everywhere, including on Mastodon. I just think Mastodon will handle it better, not only because of the moderators but because the community as a whole can help fight pedophiles. Elon Musk and other social media platforms like TikTok and Facebook have done nothing to get rid of them, which is why it spreads like wildfire. So it's up to us to fight pedophiles.
Apple launches Communication Safety, its solution against child abuse. It will scan images locally on the smartphone
Last December, #Apple announced it was ending work on the controversial #iCloud photo-scanning tool the company had developed to combat child sexual abuse material (#CSAM) while respecting users' #privacy.
#apple #strumento #icloud #CSAM #privacy #redhotcyber #informationsecurity #ethicalhacking #dataprotection #hacking #cybersecurity #cybercrime #CyberSecurityAwareness #cybersecuritytraining #CyberSecurityNews #infosecurity
Well, great! The BMI is probably negotiating in a maximally two-faced way anyway, so that Faeser can save her SPD and coalition face, even though she is actually a fan of chat control. The constitution and EU law are of no concern whatsoever to the BMI.
Afterwards it will be "boo-hoo, we couldn't do anything, it's the EU", just like with the upload filters (#Uploadfiltern).
A transparent maneuver.
#CSAM
The German parliament opposes the approach the EU is devising to detect child sexual abuse online, which would oblige service providers to scan communications/emails/chats/photos even over encrypted connections: «All experts, including child protector organizations, agree that the EU proposal goes too far and that it would undermine fundamental human rights protected by the EU Constitution.»
https://tutanota.com/blog/posts/germany-against-client-side-scanning-csam
#CSAM #abuse #encryption
What is the position of our public-service #Rundfunk (broadcasters) and of the journalists' associations on #Chatkontrolle?
Anyone who encrypts #Ende-zu-Ende (end-to-end) will apparently soon be suspected in the EU of distributing #CSAM. The privacy of correspondence, the courts, and the presumption of innocence were yesterday: tomorrow, mere suspicion will be robustly enforced, and whoever insists on confidentiality gets blocked?
Is a pre-Snowden #Totalüberwachung (total surveillance) supposed to become possible again? No #Protest from #Journalisten (journalists) and their associations?
#rundfunk #Chatkontrolle #ende #CSAM #totaluberwachung #protest #journalisten
«#Apple has begun scanning your local image files without consent».
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Note aside: «Law enforcement obtaining data on criminals is not a tragedy. Law enforcement investigating innocent people leads to extreme injustice. You should reject all law enforcement #surveillance attempts, obviously if you are criminal, but especially if you are an innocent».
#Apple #surveillance #privacyMatters #CSAM #macos
1/3
Artificial Intelligence Act #AI_Act
Cyber Resilience Act #CRA
Child Sexual Abuse Material #CSAM
Data Governance Act #DGA
Digital Markets Act #DMA
Digital Operational Resilience Act #DORA
Digital Services Act #DSA
#RechtaufReparatur
#AI_Act #CRA #CSAM #dga #dma #dora #dsa #rechtaufreparatur
-Consultant to federal, state, and local jurisdictions on #CSAM and sexual abuse/exploitation, extremist movements, and cults, along with working with survivors and helping them come out the other side.
-I read too much, know too much about the grossness and beauty of humanity due to seeing them at their worst and best
#apple completely abandons the #CSAM filter program
https://www.melamorsicata.it/2021/08/16/altre-info-sul-sistema-csam-di-apple/
#Apple has buried its plans for automatic #CSAM detection via perceptual hashing when photos are uploaded to iCloud.
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
The U-turn ultimately also came about because of objections from privacy and data-security experts, who pointed out the dangers of this practice.
We covered the topic in a Pyngu Magazin article last month:
#pyngunews #StopScanningMe #privacy #CSAM #apple
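To make the "perceptual hashing" idea from the post above concrete, here is a minimal sketch of a difference hash ("dHash"), one of the simplest members of that technique family. This is an illustrative toy, not Apple's NeuralHash or Microsoft's PhotoDNA: it downscales an image, records which of each pair of neighboring pixels is brighter, and compares hashes by Hamming distance, so near-duplicate images land close together while unrelated images land far apart.

```python
# Toy difference hash (dHash): a perceptual hash that survives small
# edits to an image, unlike a cryptographic hash.
import random

def dhash(pixels, hash_size=8):
    """Hash a 2D grayscale image (list of rows of 0-255 values).

    The image is shrunk to (hash_size+1) x hash_size via nearest-neighbor
    sampling; each bit records whether a pixel is brighter than its
    right-hand neighbor.
    """
    h, w = len(pixels), len(pixels[0])
    small = [
        [pixels[r * h // hash_size][c * w // (hash_size + 1)]
         for c in range(hash_size + 1)]
        for r in range(hash_size)
    ]
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return bin(a ^ b).count("1")

# A gradient and a slightly brightened copy should hash close together...
img1 = [[(x * 4 + y) % 256 for x in range(64)] for y in range(64)]
img2 = [[min(255, (x * 4 + y) % 256 + 3) for x in range(64)] for y in range(64)]
# ...while random noise should hash far away.
random.seed(0)
img3 = [[random.randrange(256) for _ in range(64)] for _ in range(64)]

print(hamming(dhash(img1), dhash(img2)))  # small distance
print(hamming(dhash(img1), dhash(img3)))  # large distance
```

The privacy debate in these posts turns on exactly this property: because perceptual hashes tolerate variation, scanning is a similarity search over private content, not an exact-match lookup, which is what raises the false-positive and scope-creep concerns the experts cited.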