Discussion of gender equality has long been concentrated on cultural symbols and offline society, focusing on the objectification, stereotyping, and stigmatization of women's gendered traits, and on the discrimination and unfair treatment women suffer in the labor market and in marriage and family life. With the arrival of the intelligent-technology society, some researchers have found that gender discrimination is spilling beyond these categories: cases of algorithmic gender discrimination are widespread in facial recognition, search and recommendation, smart recruitment, financial credit, and e-commerce big data. Gender inequality has seeped into all kinds of algorithmic systems, and some deep-rooted social conceptions of gender have evolved into built-in mechanisms of the algorithms themselves. These not only amplify people's prejudices and solidify them into a systematic, standardized machine order, but may also mutate and give rise to new inequalities. For this reason, in 2020 the United Nations-China Gender Fund (CGF) included in its ninth call for proposals research funding for "promoting gender equality in artificial intelligence algorithms"; the UN Women office and the winning domestic institutions in China will carry out research and intervention work in related fields to make up for the gender blind spots in AI development, which underscores the urgency of this "social-technical" problem.
In 2019, the British writer and feminist activist Caroline Criado Perez published Invisible Women: Exposing Data Bias in a World Designed for Men. In the book she points out incisively that existing big data is in fact incomplete: it lacks the data of the 50% of the world that is women. Our world suffers a serious shortage of sex-disaggregated data, because it is a world built on male data. Men are the "standard human," the world's "default setting." The arrangements of economic activity and daily life substitute male standards for the whole of human life, and every field is marked by women's (data) "absence," producing permanent, systemic discrimination against women, leaving them to live in systems and material environments that never considered their needs and to face ill-fitting services, awkward products, and bad experiences.
Moreover, this "gender data gap" did not appear only in the digital age; it is the continuation of hidden social norms. For a long time, data has simply not been collected or used in sex-disaggregated form. For example, air conditioning in theaters, classrooms, and offices is usually set to temperatures based on the male body's thermal comfort, so women generally feel too cold; mobile phones are designed too large for women to use comfortably with one hand; nurses' protective clothing during the fight against the epidemic fit poorly, because designers took the male figure as the standard; women drivers are more likely to die in car crashes, because the dummies used in car seat and safety tests are modeled on male physiques and do not match women's distribution of muscle mass, spinal vertebra spacing, bone density, or body balance, since driving is assumed by default to be a male occupation. These problems cannot be solved by painting products or spaces pink or making them one size smaller; their root is that women's data has not been incorporated into design and decision-making systems, so women's needs cannot really be taken care of. In a world built mainly for men and by men, the other half of the population is systematically ignored.
As the problem of missing women's data enters the digital and Internet era, and as humans rely more and more on data and algorithms, the technology field too is full of "unseen women": they remain the "other" and the "second sex." Perez found that speech recognition software recognizes female voices far less accurately than male voices, and when a car's voice command system cannot understand a woman driver's speech she is put in real danger; translation software renders female doctors and female PhDs with male titles by default; women are more prone to motion sickness when using VR. Gender bias in technology products stems largely from the data and algorithms behind them. Technology companies may not be malicious or even intentional; the bias is simply the product of traditional ways of thinking, yet it distorts the data that should objectively reflect production and daily life and thereby affects every aspect of women's lives. This is not a defect that can be fixed simply by raising the proportion of women employed in the IT industry or increasing the number of female programmers, nor merely by a package of algorithmic ethics, transparency, and explainability measures; the crux is that the "raw material" itself is flawed.
The arrival of the intelligent society and algorithmic technology makes the formation and evolution of social structures unusually visible. Structural analysis no longer has to rely on abstract inference about what is "invisible and intangible"; the interlocking relations among structures and their operation can be precisely traced and tracked in the machine world. Which algorithmic logic may produce which practices and institutional consequences? What kind of data trains what kind of algorithm? Compared with the pre-intelligent era, it is precisely because prejudice and discrimination have become explicit and observable through technical systems that actively identifying them and using technology to correct them in reverse may become easier. The recently emerging value-oriented data ecosystem canvas method, for instance, traces as far as possible the data exchanges among different actors in order to determine the specific flows of personal data.
In the field of cutting-edge technology, relying on philosophical critique alone to confront and overturn an order ruled by extreme, rationalist "dataism" seems of little avail. We might instead shift from "confronting" technology to "domesticating" it: starting from the data sources that feed algorithmic automation systems, exploring, supplementing, and incorporating as much as possible the data of women, who likewise "hold up half the sky." Such data includes not only women's known physiological characteristics, behavioral habits, and device-usage patterns, but also many latent elements at the psychological, spiritual, and cultural levels. The first aim is to prevent the original gender inequalities of the social and cultural order from being written into machine decision-making processes, where they would be replicated, inherited, and reinforced as old inequalities or would generate new ones whose consequences would be far harder to shake. This concerns not only the design ideas and user experience of particular technology products, but also the formation and supply of public policy. The second aim is to overcome the man-made, subjective inequalities imposed by coercive technical forces and, in turn, to reverse and correct the status quo.
But this idea still has two problems to solve. First, with the Personal Information Protection Law and the Data Security Law included in the legislative plan of the Standing Committee of the National People's Congress, and with China's data policy growing stricter, how do we balance the tension between expanding the collection of women's data and protecting personal privacy? Of course, this question should be handled without regard to gender: all data should be developed reasonably under the premise of strict de-identification and strengthened protection, releasing the positive role of data in improving women's living conditions. That is, be gender-blind in protection, and expand coverage of women in development, collection, and decision-making. Second, there is no consensus on what gender equality is; the public's views and feelings are subjective and differ from person to person, and we must also prevent intelligent technology from defining equality and monopolizing its interpretation, forming a new hegemony. Therefore, where objectively fair distribution in society is at stake, women should not be singled out as a separate category subject to technology's coercive power; while in areas where difference deserves to be maintained, machines should be trained to be fully sensitive to gender, giving technology a "women's perspective." This means acknowledging difference without turning it into a chasm, and it demands highly refined and comprehensive algorithm governance.
In short, women's data also "holds up half the sky." The purpose of improving data sources is to guide the machine's "hard rules" to evolve in a humane and gender-friendly direction, to make the interactive environments and the lifeworld we inhabit more equal, inclusive, and "soft," and to let artificial intelligence empower women. This by no means seeks an absolute sameness that erases differences in the division of labor and in gender culture; on the contrary, such equality is grounded precisely in the full recognition, respect, and maintenance of difference.
(This article is a phased result of the Shanghai Municipal Social Science Planning Youth Project "Research on the Ethical Dilemmas and Response Strategies of Information Platform Practitioners in the Post-Pandemic Era" (2020EXW002).)
(Author's affiliation: Department of Cultural Studies, Shanghai University; Center for Contemporary Chinese Cultural Studies, Shanghai University)