
Read the following passage and choose the best answer (A, B, C, D):
Don’t look now, but artificial intelligence is watching you. Artificial intelligence has tremendous power to enhance spying, and both authoritarian governments and democracies are adopting the technology as a tool of political and social control. Data collected from apps and websites already help optimize ads and social feeds. The same data can also reveal someone’s personal life and political leanings to the authorities. The trend is advancing thanks to smartphones, smart cameras, and more advanced AI.

An algorithm developed at Stanford in 2017 claimed to tell from a photograph whether a person is gay. Accurate or not, such a tool creates a new opportunity for persecution. “Take this type of technology, feed it to a citywide CCTV surveillance system, and go to a place like Saudi Arabia where being gay is considered a crime,” says Lisa Talia Moretti, a digital sociologist. “Suddenly you’re pulling people off the street and arresting them because they’re gay, because the computer said so.”

No country has embraced facial recognition and AI surveillance as keenly as China. The AI industry there has flourished thanks to fierce competition and unrivaled access to personal data, and the rise of AI is enabling tighter government control of information, speech, and freedoms. In some Chinese cities, facial recognition is used to catch criminals in surveillance footage and to publicly shame those who commit minor offenses. Most troubling, AI is being used in Xinjiang, a province in Western China, to persecute Muslims. Even if China’s AI capabilities are exaggerated, the AI boom there is having a chilling effect on personal freedom, says Ian Bremmer, an expert on global political risk and founder of the Eurasia Group. “You just need a government that is starting to get that capacity and make it known, and have a few people that are sort of strung up as examples, and suddenly everyone is scared,” he says.
This might feel like a distant reality, but similar tools are being developed and used in the West. Just ask Glenn Rodriguez, who faced judgment from an algorithm when seeking parole from prison in the US. Despite 10 years of good behavior, Rodriguez saw how an algorithm called COMPAS, designed to predict inmates’ likelihood of reoffending, would be biased against him. And even though the parole board went against the computer program’s
5. According to paragraph 3, which job is NOT performed by AI surveillance systems in China?

Think about and answer the question before viewing the answer.
