Automated technology frequently contains inherent biases.

The use of algorithms for moderation creates a clear risk that the screening systems will disproportionately block content relating to or posted by minority ethnic or religious groups.

#onlinesafetybill #censorship #freespeech #codedbias

Last updated 1 year ago

Decisions on illegality won't be made by the courts, but by private providers with broad discretion.

Vast amounts of content are posted every day, so picking out 'illegal' content will be done by faulty algorithms.

#onlinesafetybill #censorship #freespeech #codedbias

Last updated 1 year ago

Civil society groups say the use of sensitive metrics is exacerbating discrimination in policing.

See for yourself with this predictive policing tool from Fair Trials.

13/15

fairtrials.org/predictive-poli

#policing #precrime #codedbias #predictivepolicing

Last updated 2 years ago

In the UK, algorithms analyse huge amounts of information, using discriminatory factors like ‘cramped houses’ and ‘jobs with high turnover’ to predict how likely someone is to commit a crime.

See Liberty's factsheet on predictive policing.

12/15

libertyhumanrights.org.uk/fund

#police #predictivepolicing #policing #precrime #codedbias

Last updated 2 years ago

Wanda Whitney · @bibliotecaria
849 followers · 695 posts · Server blacktwitter.io

@dltj

Big sigh. So many stories out there like this. Check out and for their work. We need more accountability.

ajl.org/

#ethicalai #codedbias #algorithmicjusticeleague

Last updated 2 years ago

Your personal data is used to profile you, making it easier for biased algorithms to decide on job applications, loans, housing and more. Data discrimination has real consequences, and marginalised groups bear the brunt. We ask councils to fight this practice.

openrightsgroup.org/publicatio

#datadiscrimination #dataprotection #gdpr #codedbias

Last updated 2 years ago

Kintsugi LAB · @klab
19 followers · 5 posts · Server mastodon.kintsugi-lab.com

Algorithms reflect the past, and its darkest side.
If machines learn automatically from this data, written by the power of the white Silicon Valley man, what we get is a reproduction of this unjust and cruel system on an enormous scale. “We know that people can be unjust, but we believe algorithms can't be, and that's the problem”

🤔 Is a feminist AI possible?

medium.com/@erikairusta/quién-

#feministai #datafeminism #codedbias #criticalaiethics

Last updated 2 years ago

Wanda Whitney · @bibliotecaria
741 followers · 440 posts · Server blacktwitter.io

Also just watched Coded Bias on Netflix yesterday. Pretty scary use of AI/facial recognition. 😱

#codedbias

Last updated 2 years ago

Yarden Laifenfeld · @yarden
130 followers · 216 posts · Server hachyderm.io

I’m 30 minutes into Coded Bias (the documentary on Netflix). Everything they are saying is against capitalism and certain politics, and very little is actually about AI.
They somehow attribute so many bad things happening around the world to the use of AI, saying that AI is just replicating the past. But I feel like if the past taught us anything it’s that bad things happened way before modern technology…
Why blame AI, which can do as much good as it can do bad?

#codedbias #netflix #documentary #ai

Last updated 2 years ago

⛈️ rain · @rain
6 followers · 63 posts · Server systemli.social

Well put by @abebab:

"Although the choices of those with privilege have created these systems, for some reason it seems to be the job of the marginalized to “fix” them. In response to ChatGPT’s racist and misogynist output, OpenAI CEO Sam Altman appealed to the community of users to help improve the model."

wired.com/story/large-language

#chatgpt #openai #codedbias

Last updated 2 years ago

The age-verification system being tested by Instagram is 'less accurate for female faces and people with darker skin.' For under-24s 'its estimates can be off by up to 2.5 years.'

The Online Safety Bill permits any tech to check your age. From social media to search engines, what you can access online will be decided by biased AI running guesstimates.

Bad law creates injustice.

theverge.com/2022/6/23/2317975

#onlinesafetybill #codedbias #blockthebill #privacy #freedomofexpression

Last updated 2 years ago

SaanaAve · @saanaave
476 followers · 109 posts · Server mastodontti.fi