Authorities in San Francisco, USA, plan to use an artificial intelligence (AI) tool to mitigate the risk of bias when deciding whether to charge suspects. The 'bias mitigation tool' is intended to address racial bias in the legal system and thus prevent people from being prosecuted on the basis of that bias. The AI tool will be applied to documentation processed by a prosecutor and will mainly redact information in a police report that could identify a suspect's race (descriptions of race, hair and eye colour, neighbourhoods, and names of people if they could indicate the individual's racial background). It will also hide information identifying specific police officers, so that prosecutors' decisions are not biased by familiarity with the officers involved. Expected to launch in July, the system will use computer vision algorithms to recognise words and replace them with generic alternatives, such as 'location' and 'officer 2'.
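The article does not describe how the redaction is implemented, but the core idea of replacing identifying terms with generic placeholders can be illustrated with a simple keyword-substitution sketch. The Python example below is purely hypothetical: the term lists, the `redact` function, and the patterns are assumptions for illustration, not the actual system's logic.

```python
import re

# Hypothetical term lists -- the real tool's data sources are not public.
NEIGHBOURHOODS = ["Bayview", "Tenderloin"]            # placeholder neighbourhood names
RACE_TERMS = ["black", "white", "asian", "hispanic"]  # placeholder descriptors
OFFICER_PATTERN = re.compile(r"Officer\s+[A-Z][a-z]+")  # e.g. "Officer Smith"


def redact(report: str) -> str:
    """Replace potentially identifying terms with generic placeholders."""
    officer_count = 0

    def officer_sub(match: re.Match) -> str:
        # Number officers sequentially, e.g. "Officer 1", "Officer 2".
        nonlocal officer_count
        officer_count += 1
        return f"Officer {officer_count}"

    # Replace named officers with numbered placeholders.
    report = OFFICER_PATTERN.sub(officer_sub, report)

    # Replace neighbourhood names with a generic token.
    for place in NEIGHBOURHOODS:
        report = re.sub(re.escape(place), "LOCATION", report, flags=re.IGNORECASE)

    # Remove explicit race descriptors.
    for term in RACE_TERMS:
        report = re.sub(rf"\b{term}\b", "REDACTED", report, flags=re.IGNORECASE)

    return report


if __name__ == "__main__":
    sample = "Officer Smith stopped a black male near Bayview."
    print(redact(sample))
    # -> "Officer 1 stopped a REDACTED male near LOCATION."
```

In practice a system like the one described would need far more than fixed keyword lists (for example, recognising names and locations it has never seen before), which is presumably why the article refers to machine-learning algorithms rather than simple substitution rules.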