Deepfakes also have the potential to



A good fake, created with Artificial Intelligence trained on hours of footage, is generated specifically for its context, with seamless mouth and head movements and appropriate coloring. Simply superimposing a head onto a body and animating it by hand can produce large contextual mismatches that are very obvious to the human eye. Speculation and concern about the broader possible uses of the technology began immediately. A still from a YouTube demo of 'Synthesizing Obama'.

Supasorn Suwajanakorn/YouTube. The videos raised concerns about possible future uses of the technology and its ethics. The question of consent came up immediately. More alarming was the potential for blackmail and the application of the technology to those in power. In 2017, months before pornographic deepfakes emerged, a team of researchers at the University of Washington made headlines when they published a computer-generated video of Barack Obama speaking, built from old audio and video clips.



At the time, the risks around the spread of misinformation were clear, but they seemed very distant, given that it was academic researchers producing these videos. User-level creations added an alarming urgency to those risks. In January 2018, shortly after pornographic deepfakes appeared, FakeApp, a desktop application for creating deepfakes, became available for download, allowing almost anyone to make deepfake videos. A thread was even created on Reddit with dozens of videos of this type. Manipulated Obama video (AP).