On October 27, CCTV News reported that on some online trading platforms, thousands of face photos can be purchased for as little as 2 yuan. If these photos fall into the hands of criminals, they may be used for precision fraud, money laundering, gang-related crime, and other offenses. For example, illegally obtained photos can be used to forge dynamic videos of other people, crack facial recognition systems, and defraud victims of their accounts and property. The news immediately sparked heated discussion: netizens called it "terrible" and said there is "no privacy at all," and many shared their own experiences of image information leaks on social platforms.

According to CCTV News, the App Special Governance Working Group, established by the National Information Security Standardization Technical Committee and other bodies, recently released the "Facial Recognition Application Public Survey Report 2020," which found that 90% of respondents have used facial recognition, 60% believe facial recognition tends to be abused, and 30% said their personal privacy or property security has been harmed as a result.

In recent years, as face-scanning payment and face-verification features have entered mobile apps and everyday scenarios, incidents of face information leakage, face image abuse, and AI face-swapping have been exposed one after another, fueling public concern. Although the relevant departments continue to report crackdowns on such illegal activity, some criminals still take the risk. The more powerful AI technology becomes, the steeper the price if it falls into the wrong hands.

How secure is facial recognition? What gray areas lie behind the technology's rapid development, and how can they be addressed? These questions hang like a sword over many people's heads, keeping them from using convenient technology with peace of mind.
1. 5,000 face photos for less than 10 yuan: privacy laid bare

CCTV reported that some facial information has been leaked by app operators and technology developers, resulting in the abuse of facial data and even the formation of a black-market industry chain. "On online trading platforms, you can buy thousands of face photos for only 2 yuan, and 5,000 face photos for less than 10 yuan. Browsing a merchant's photo library, you find real-life photos, selfies, and other private pictures. If these photos fall into the hands of criminals, they are likely to be used for precision fraud, or even money laundering, gang-related crime, and other illegal activities."

Seeing such reports, many netizens felt a nerve had been touched and were furious. One person quickly shared a similar experience: a friend was defrauded of nearly 100,000 yuan after trusting personal information about his college roommate supplied by an unscrupulous vendor, information that matched the real thing exactly. Others strongly urged the relevant departments to investigate, including loophole-ridden applications such as "fortune telling by face recognition." Still others, from the user's own perspective, proposed self-defense strategies such as "regularly cleaning out profile photo albums" and "not using the same nickname on different social platforms," suggestions that drew many likes.

Clearly, as AI enters the lives of ordinary people, data privacy concerns have become a thorn in people's hearts; whenever a similar incident breaks out, it provokes a strong public reaction.

2. The routine: using illegally obtained photos for AI face-swaps to deceive facial recognition

Is facial recognition safe? Will information leak? Many people are worried.
According to CCTV news reports, on August 13 this year the public security department of Qiantang New District, Hangzhou, arrested two suspects for stealing personal information. Using technical means to deceive platforms' facial recognition, they stole thousands of account holders' personal information across multiple online platforms and prepared to resell it at 80 to 100 yuan per order. At the beginning of this year, a group in Quzhou, Zhejiang, was arrested for using technical means to deceive Alipay's facial recognition authentication, illegally profiting tens of thousands of yuan by registering Alipay accounts with citizens' personal information.

CCTV noted that in both of the above cases, the suspects used AI face-swapping technology: they pre-processed illegally obtained photos of citizens, then used "photo activation" software to generate dynamic videos that deceived the face verification mechanism. They then logged in to various online service platforms, using privately traded social accounts bought in bulk online, to register members or complete real-name authentication.

It is worth mentioning that the deployment of AI technology is still in its early stages worldwide; data privacy leaks and AI face-swapping fraud have also occurred elsewhere, including around the US presidential election and in the use of Amazon's Ring home cameras. ("Amazon and Google home cameras fail frequently: how did the baby-monitoring device become a 'devil's eye'?"; "Fake videos of the US presidential election are flying everywhere! A full account of the offense-defense battle over facial technology")

3. The technical threshold of the AI black market is low, and the industry chain is already complete

As early as July 2019, Zhidongxi visited and investigated the AI face-swapping black market.
100 yuan for 200 face-swapped pornographic videos, featuring domestic first- and second-tier female stars; 5 photos to get a customized face-swap video; 400 yuan for face-swap software and tutorials, with instruction guaranteed... Behind these astonishing transactions, the AI face-swapping black market has formed a complete industry chain. ("The AI face-swapping black market: 100 yuan for 200 face-swapped pornographic videos, 5 photos for a customized video")

The method rests on a simple deep learning algorithm: given photos or a video, the algorithm replaces the face in them with any target face. The operation is relatively simple, and the realism of the resulting video or photo is startling.

▲ Different sellers' prices for finished videos; the technical threshold behind them is very low.

Download the deepfakes code open-sourced on GitHub as early as 2017, feed it a material library suited to the face-replacement deep learning task, and you can swap Zhu Yin's face for Yang Mi's.

In August 2019, an app called "ZAO" went viral, showing the public the magic of AI face-swapping while also touching the sore spot of data privacy. Because ZAO did not stop users from abusing other people's photos for pranks, and forcibly claimed usage rights over users' photos, it fell from second place in the App Store only three days after launch and was forced offline. ("The truth behind the ZAO face-swap app's viral spread: built by Momo, with technology once used for erotic videos")

In a report uncovering the black market for defeating facial recognition, some people even set up studios teaching face-recognition-cracking techniques, earning 30,000 yuan a month from cracking orders and then launching an "800 yuan to learn the black tech" teaching model.
This black market comprises a complete industry chain, including resellers of real-name accounts, authentication agents, and providers of facial photo information.

4. Data storage is critical and requires a multi-pronged approach of technology and regulation

CCTV pointed out that the more worrying risk in facial recognition today is the storage of face information. Where is it stored? Experts interviewed by CCTV believe that, given the wide variety of facial recognition applications and the lack of unified standards, a large amount of face data is stored in the central databases of application operators or technology providers. Whether the data is desensitized, whether security measures are in place, and which data is shared with partners are all unknown to the outside world. Moreover, once a server is breached, highly sensitive face information is at risk of being leaked.

To plug these loopholes, experts propose layered authorization, distributed storage, digital desensitization, and encryption. At the same time, the facial recognition industry needs strict industry standards to manage risk. Operators and technology developers cannot become isolated islands that pursue technical upgrades while ignoring privacy risks; stricter industry standards and legal supervision are needed.

In fact, relevant laws and regulations in China have already begun to address this. Under the Civil Code, the collection of a natural person's information must follow the "notification-consent" principle: the person's consent must be obtained, and the person whose information is collected also has the right to withdraw it.
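To make the experts' proposal of "digital desensitization" concrete, here is a minimal sketch of one such measure: replacing real user identifiers with keyed hashes and coarsening stored face embeddings before they ever reach a central database. The helper names (`pseudonymize_id`, `desensitize_record`) and the secret-key handling are illustrative assumptions, not the practice of any specific operator; real deployments would add proper key management and encryption of the embedding payload itself.

```python
# Sketch of "digital desensitization" for stored face records, assuming a
# deployment-wide secret key kept outside the database (e.g., in a KMS).
# All helper names here are hypothetical, for illustration only.
import hmac
import hashlib

SECRET_KEY = b"deployment-secret-key"  # illustrative; never hard-code in practice


def pseudonymize_id(user_id: str, key: bytes = SECRET_KEY) -> str:
    """Replace a real user ID with a keyed hash (HMAC-SHA256), so a leaked
    database cannot be linked back to individuals without the key."""
    return hmac.new(key, user_id.encode("utf-8"), hashlib.sha256).hexdigest()


def desensitize_record(user_id: str, embedding: list) -> dict:
    """Store only the pseudonymous ID and a coarsened embedding; raw photos
    and real identifiers never enter this store."""
    # Quantizing the embedding limits how precisely a leaked vector can be
    # used to re-identify or reconstruct a face.
    quantized = [round(x, 2) for x in embedding]
    return {"pid": pseudonymize_id(user_id), "embedding": quantized}


if __name__ == "__main__":
    record = desensitize_record("user-42", [0.1234, -0.5678, 0.9])
    print(record["pid"][:8], record["embedding"])
```

The design point is that the linking key lives with the operator's key service, not alongside the data: an attacker who dumps the database gets opaque pseudonyms and degraded vectors rather than named face profiles.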
The draft Personal Information Protection Law of the People's Republic of China, currently soliciting public comment, proposes that installing image collection or personal identification equipment in public places must be necessary for maintaining public safety, must comply with relevant national regulations, and must be accompanied by prominent warning signs.

5. Conclusion: the facial recognition market needs regulation

With the development of technologies such as facial recognition, the double-edged nature of technology has become apparent. Properly applied, data and algorithms make people's lives more convenient; abused, they leave people effectively running naked before AI. The data privacy problems in AI deployment are often kept quiet, but the last snowflake before the avalanche deserves vigilance. To that end, application developers, operators, and the relevant regulatory departments must each play their part, keep pace with the offense-defense race of AI technology, and win the battle against AI abuse.