The Federal Trade Commission (FTC) is seeking to update its regulations around deepfakes and impersonations of businesses, government agencies, or celebrities that are used to trick consumers.
Depending on its final language, the regulation could make it illegal for generative AI platforms to offer services that can be used to harm consumers through impersonation. “With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” FTC Chair Lina Khan said in a press statement.
“Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”
The updated rules for government and business impersonation would empower the FTC to bring federal court cases that directly compel scammers to return illegally acquired funds. For example, a deepfake of a celebrity like Oprah hawking vitamin pills could be pursued as impersonation. There are no federal laws that address the creation or sharing of deepfake images and videos, though the people depicted have recourse through copyright law, publicity rights, and other protections.
The final rule will become effective 30 days after publication in the Federal Register. The public comment period for the proposed rule will remain open for 60 days after it is published, giving the FTC time to weigh public feedback on the rule change.
It’s worth noting that the pace of generative AI development has accelerated sharply since 2023. On February 15, OpenAI teased a new text-to-video model called ‘Sora,’ which can turn text prompts into semi-realistic video. OpenAI says it has no plans to make the video model broadly available ‘yet,’ though that leaves the door open to a wider release.
“We’re teaching AI to understand and simulate the physical world in motion, with the goal of training models that help people solve problems that require real-world interaction,” OpenAI said in a statement about the new technology. Videos shared on X/Twitter show that the AI still struggles with that goal, however. In one clip, generated from a prompt suggesting archaeologists of the future digging up a plastic chair, the chair moves around like a towel in the breeze.