Denmark Moves to Ban AI Deepfakes, Giving Citizens Copyright Over Their Own Likeness

Denmark is preparing to pass one of the world’s toughest laws against AI-generated deepfakes, aiming to give citizens new legal rights over their appearance, voice, and likeness online. The bill — expected to pass early next year — would make it illegal to share or distribute deepfake content without a person’s consent, extending copyright protections to individuals.

The proposed legislation follows growing concern about the rapid spread of deepfakes — hyper-realistic AI-generated videos, images, or audio that impersonate real people. Danish Culture Minister Jakob Engel-Schmidt said the move is essential to protect both private citizens and democracy itself, warning that political deepfakes could “undermine our democracy” by spreading falsehoods.

Under the new law, Danes would be able to demand takedowns of AI-generated content that misuses their likeness, while parody and satire would remain protected. Major tech platforms that fail to remove harmful deepfakes could face significant fines, although individuals are unlikely to face criminal penalties.

Experts have praised the move as a landmark step. “When people ask, ‘what can I do to protect myself from being deepfaked,’ the answer right now is basically nothing,” said Henry Ajder, a generative AI researcher and founder of Latent Space Advisory. “Denmark is one of the first governments to change that.”

The Danish proposal mirrors similar measures abroad. The United States recently criminalized the sharing of non-consensual intimate deepfakes, while South Korea introduced harsh penalties for deepfake pornography. Denmark’s initiative could now influence European Union policy, with France and Ireland reportedly showing interest in adopting similar laws.

For victims like Marie Watson, a Danish video game streamer whose photos were digitally altered and shared online, the legislation comes too late to undo the damage but offers hope for future protection. “When it’s online, you’re done. You can’t do anything,” she said. “It’s out of your control.”

U.S. Lawmakers Warn UK: Encryption Backdoor Order to Apple Threatens Global Cybersecurity

Senior U.S. lawmakers have sharply criticized the United Kingdom’s order requiring Apple to create a backdoor into its end-to-end encrypted services, warning that such a move could weaken global cybersecurity and violate privacy rights.

What Happened?

  • U.S. House Judiciary Chair Jim Jordan and Foreign Affairs Chair Brian Mast sent a joint letter to UK Home Secretary Yvette Cooper, urging a reconsideration of the order, known as a Technical Capability Notice (TCN).

  • The TCN compels Apple to make encrypted user data accessible to UK authorities, prompting Apple to withdraw its Advanced Data Protection feature in the UK earlier this year.

“Creating a backdoor… introduces systemic vulnerabilities that can be exploited by cybercriminals and authoritarian regimes,” the lawmakers warned.

Key Concerns from U.S. Lawmakers:

  • Global Implications: Because Apple serves users worldwide, any security backdoor would have ramifications for U.S. citizens and others outside the UK.

  • International Law Violation? The lawmakers argue the UK’s TCN may breach the U.S.-UK CLOUD Act agreement, which prohibits orders requiring decryption.

  • Secrecy and Transparency Issues: UK law forbids Apple from disclosing the existence of the order—even to the U.S. Department of Justice, the justice department of Apple’s own home government.

  • Human Rights Risk: The TCN “conflicts with international human rights standards,” they said, citing European Court of Human Rights precedent protecting encryption under the right to privacy.

Apple’s Position:

Apple has consistently refused to build backdoors into its devices, stating that doing so would compromise the security of all users, not just those under investigation. The company is challenging the TCN at the UK’s Investigatory Powers Tribunal.

UK Government Response:

The Home Office maintains that access to individual data would still require a separate judicial warrant, not blanket access. However, critics argue that weakening encryption—even with such controls—creates security risks that cannot be undone once the backdoor exists.