Artificial intelligence should be used cautiously in the judicial field
December 7, 2022, 09:51 | Source: Chinese Social Sciences Today, No. 2546, December 7, 2022 | Compiled by Lian Zhixian

  The official website of Concordia University in Canada recently reported that research by the university's scholar Neha Chugh argues that it is too early to apply artificial intelligence to Canada's criminal justice system, because there are not yet sufficient safeguards against bias in either machines or humans. Chugh's new study, "Risk Assessment Tools on Trial: AI Systems in Bail and Corrections," appears in the latest issue of IEEE Technology and Society Magazine.

Today, artificial intelligence technology is widely used in many fields, including criminal justice systems. In pre-trial risk assessment in the United States, algorithms and data are increasingly used to evaluate the risk that a defendant will reoffend. Supporters argue that artificial intelligence can avoid the inherent biases of police, judges, prosecutors, and other criminal justice officials. Chugh questions this claim. She notes in the paper that although AI risk assessment has not yet been applied in Canadian courts, some warning signs have already appeared, and the justice system has problems that urgently need to be addressed. She says that Indigenous defendants are overrepresented in the criminal justice system, and as a result are more likely to be harmed by the defects of data-driven tools.

Chugh sees Ewert v. Canada as an example of the general problems with risk assessment tools. Jeffrey Ewert is Métis, that is, one of his parents is Indigenous Canadian and the other is of European descent; he was sentenced to life imprisonment for murder and attempted murder. Ewert claimed that the psychological risk assessment system used by the Correctional Service of Canada was culturally biased against Indigenous people, causing Indigenous people to serve far longer sentences, and under worse conditions, than non-Indigenous people. Ultimately, Ewert's appeal to the Supreme Court succeeded.

Chugh said that bias and discrimination mislead assessment tools, so how can one ensure that an algorithm ultimately produces the right result? Using artificial intelligence for risk assessment merely transfers bias from humans to the algorithms that humans create, and a biased database will only lead the algorithm to biased results. Laws and regulations give judges a certain amount of discretion, preserving room to consider a defendant's personal history and special circumstances. One of the main problems with over-reliance on artificial intelligence for risk assessment is the absence of this subjective judgment and respect. Chugh emphasized that she does not oppose the use of artificial intelligence in the court system, but that further research is still needed. For example, the information fed into the algorithms, and the impact of human bias on their results, need to be analyzed.
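The mechanism Chugh describes, bias passing from humans into the data and from the data into the algorithm, can be illustrated with a minimal sketch. All numbers, group labels, and function names below are hypothetical, invented for illustration, and are not drawn from her study or from any real tool:

```python
# Hypothetical records: over-policing of group "A" inflates its recorded
# arrest rate, even though the true underlying rate is identical for both
# groups. The "true" rate is unobservable to any tool trained on records.
historical_arrests = {"A": 80, "B": 40}   # recorded arrests per 1,000 people (biased)
true_offending     = {"A": 40, "B": 40}   # actual rate, identical across groups

def naive_risk_score(group: str) -> float:
    """Score a defendant by their group's historical arrest rate --
    the kind of base-rate feature a data-driven tool might rely on."""
    return historical_arrests[group] / 1000

# The tool rates group A as twice as risky, purely because the input data
# recorded twice as many arrests: the human bias is now algorithmic.
print(naive_risk_score("A"))  # 0.08
print(naive_risk_score("B"))  # 0.04
```

The point of the sketch is that the scoring function itself contains no explicit prejudice; the disparity enters entirely through the historical records it consumes, which is why auditing the input data matters as much as auditing the algorithm.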

(Compiled by Lian Zhixian)

Editor in charge: Chen Jing