What is wrong with artificial intelligence in 2016?

Self-driving cars, human-versus-machine Go matches, and the TV drama "Westworld" depicting robots gaining self-consciousness... In 2016, artificial intelligence and related fields were hotly debated by industry, academia, and society at large. At the same time, people's concerns about artificial intelligence never faded: when, and in what ways, does artificial intelligence go wrong?

Near the end of the year, a number of artificial intelligence experts analyzed the typical "failure" cases of artificial intelligence in 2016 and concluded that these errors were concentrated in the machine learning and task execution stages of AI systems.

[Discrimination: Is It All Humans' Fault?]

A number of AI experts believe that racial bias is a major problem in current artificial intelligence systems, and that it may be related to the backgrounds of the systems' designers.

In 2016, multiple courts in the United States began using artificial intelligence systems to predict the probability that an offender would commit another crime, as a basis for judges to grant probation. However, when the media investigated a so-called "Minority Report"-style prediction system used by Florida courts, it found that black defendants were predicted to be twice as likely to reoffend as other ethnic groups, and their predicted odds of committing another violent crime were 77% higher than those of other groups. The data showed that the system's predictions were accurate only 20% of the time. As media commentators put it, the only thing this crime prediction system could reliably judge was the race of the people it assessed.

In 2016, a number of companies in the United States jointly launched the "world's first artificial intelligence beauty contest." Participants uploaded photos to the contest website, and artificial intelligence algorithms were used to "precisely" evaluate the contestants' beauty. However, the winners were almost all white. Roman Yampolskiy, director of the Cybersecurity Lab at the University of Louisville, pointed out that the samples used to train the AI were clearly insufficient, confining its notion of beauty to a single fixed model.

To tap the youth market, Microsoft launched the artificial intelligence chatbot Tay on Twitter in the spring. However, less than a day after launch, Tay had become, in the words of netizens, "a monster who loved Hitler and insulted women," and Microsoft immediately took Tay offline.

Nintendo's "Pokemon Go" became popular around the world in July. Players soon discovered that there were very few Pokemon in black neighborhoods. According to USA Today, in the Los Angeles area white neighborhoods had an average of 55 "PokeStops," while black neighborhoods averaged fewer than 19; similar disparities appeared in Detroit, Miami, and Chicago, making it hard for black players to take part in the game near their own homes. The game's developers admitted that this augmented-reality game was built on a map system crowdsourced mainly from white players, and that developers had not spent much time on black neighborhoods.
