Binance acknowledged the threat of fake accounts using deepfakes
While it is still possible to distinguish real people from copies on video, AI technology will soon become so advanced that even the human eye will not detect the substitution
The deepfake technology that fraudsters use to circumvent identity verification (KYC) on cryptocurrency exchanges such as Binance will become increasingly sophisticated, Binance's chief security officer Jimmy Su warned, Cointelegraph writes.
Cases in which fraudsters use this technology in an attempt to bypass the exchange's verification process have increased, Su told the publication. He said the criminals find photos of the victim on the Internet and use them to create deepfakes.
He explained that the tools have become so advanced that they can even respond correctly, in real time, to audio instructions during applicant verification.
"Some checks require the user to, for example, blink his left eye or look left or right, look up or down. Deepfakes today are advanced enough to execute those commands," Su said.
For now, he said, humans can still detect the deception, but AI will eventually evolve to the point where people can no longer tell real footage from fake. Su acknowledged that this is a very serious problem, and for now he sees the only solution in training users in risk management.
In January, Microsoft founder Bill Gates said that the development of AI is the most important innovation in recent years. And one of the pioneers of neural networks and artificial intelligence technology, Geoffrey Hinton, recently joined those who see such developments as a danger to humanity. He fears that the Internet will be flooded with fake pictures, videos, and texts created by AI, and that ordinary people "will no longer be able to know what is true and what is not."