In recent years, with the development of technology and society, autonomous driving has gradually become a reality. Relying on artificial intelligence and networking technology, autonomous driving builds an efficient, coordinated vehicle network: it monitors road conditions in real time through satellite navigation and enables data sharing and interaction between vehicles and roads, between vehicles, and between vehicles and people. It frees the driver's hands and selects the best route, helping to relieve traffic pressure and shorten commuting time; optimized cruise control can effectively improve fuel efficiency and conserve resources; and autonomous driving can also prevent traffic accidents caused by drunk or fatigued driving. Because of these many advantages, autonomous driving has come to be valued more and more. However, autonomous driving is a "double-edged sword": while it brings people great convenience, it also has shortcomings. For example, when encountering an emergency, can autonomous driving make better decisions than a human driver? This question is reflected above all in the "trolley problem".
"Tram Difficulty" is the ideological experiment in the ethics field。The general content is: Five people are tied to the tram track,An out of control tram drives at them,is about to crush them to them,At this time, you can pull a pull rod,Make the tram off on another track,But one person is tied to another track。Under this situation,How to choose? The utilitarian believes that moral decisions should follow the greatest happiness that pursue the biggest most people,The principle of maximum benefits,May choose to pull down the lever,Save five people and sacrifice one person,And thinks that the victim is a loss of loss,It is acceptable to the result。Critic of utilitarianism believes,A person's life cannot be compared and measured,If you think "not killing" is a sacred and inviolable moral rule,So as long as you murder your life,Even to save more people,That is also wrong。In this case,If pulling down the lever,So it should be the responsibility of the person's death。This is caught in the dilemma of morality。
Some researchers have studied moral problems from the perspective of neuropsychology, using functional magnetic resonance imaging to examine individuals' patterns of brain activation while they solve such problems. They found that multiple brain regions are activated when individuals work through moral problems: the ventromedial prefrontal cortex plays an important role in regulating the emotional experience produced during moral processing, and the orbitofrontal cortex is also closely related to moral judgment. When individuals judge certain moral problems, the brain areas responsible for emotional processing are more active than those responsible for cognitive activities, which indicates that when facing hard choices such as the trolley problem, individuals are influenced by emotional and rational thinking at the same time. Joshua Greene pointed out that when individuals facing a moral dilemma make emotionally driven moral decisions, the orbitofrontal cortex and ventromedial prefrontal cortex are activated, while the dorsolateral prefrontal cortex weakens the activation of these two regions. Compared with deontological choices, individuals show higher levels of skin conductance activation when making utilitarian choices. In short, resolving a moral dilemma is a complex, higher-order cognitive activity, and multiple brain areas are activated when individuals carry out this special cognitive activity.
With the rapid development of autonomous driving, its penetration rate will continue to rise, and the traffic accidents that accompany it are attracting growing attention. In 2018, the world's first pedestrian fatality caused by an autonomous vehicle occurred in Arizona, in the United States; the autonomous driving system misidentified the pedestrian and took no avoidance measures. The incident shook people's trust in autonomous vehicles. When facing an emergency on the road, a human driver will try to make the best decision and bear the consequences of that decision. So when an autonomous driving system faces a similar situation, on what principles should it decide? And how should the party responsible for the accident be identified?
Ryosuke Yokoi and colleagues, working from the perspective of traffic psychology, studied the "trolley problem" as it arises in autonomous driving. Using a 2 × 2 experimental design, they invited 128 volunteers to complete a moral dilemma experiment, exploring the effects of moral belief (same/different) and driving agent (autonomous/manual) on trust in autonomous driving. They found that 104 participants made the utilitarian choice (sacrificing one person to save five), while only 24 made the deontological choice; and whether the vehicle was driven autonomously or manually, as long as the driving agent's moral choice matched their own, individuals trusted it. This indicates that shared moral belief is an important factor affecting people's trust in autonomous driving. From this we can draw the following lesson: policymakers and artificial intelligence developers can train autonomous driving systems to learn human moral beliefs and moral rules, so that the technology can better win people's trust and benefit them.
Giulia Benvegnù and colleagues used virtual reality technology to examine, under manual and autonomous driving conditions, the relationship between individuals' emotional responses, their sense of responsibility, and the acceptability of moral behavior when facing moral dilemmas. They found that when facing a moral dilemma, autonomous driving lowers the level of negative emotional experience and of moral responsibility compared with manual driving; in other words, manual driving in that situation produces a stronger negative emotional experience and a stronger sense of moral responsibility. When no moral dilemma is involved, however, the pleasure individuals experience under manual driving is stronger than under autonomous driving. This also indicates that individuals' emotional experience needs to be considered when studying decision-making as applied to autonomous vehicles.
"Tram Difficulty" in the field of autonomous driving,You can start from the following aspects。First,From the technical level,We must continuously promote the optimization and upgrade of the autonomous driving system,Strengthen the development of algorithms and system moral decisions,Let the artificial intelligence system learn the emotional experience that occurs when human beings face moral dilemma。2,From the social level,To formulate and improve relevant laws and regulations,Clarify the responsibilities of all parties,Let the autonomous driving develop in a healthy direction under the constraints of laws and regulations。Last,From a personal perspective,When we enjoy the convenience brought by autonomous driving,Still need to establish a firm safety awareness,Consciously abide by traffic rules,Abandoning the dangerous concepts such as "vehicles will avoid pedestrians"。
(Author's affiliation: School of Educational Science, Yangzhou University)