by Contributor | May 19, 2023 12:56 pm
WASHINGTON, D.C. – Today, U.S. Senator Kirsten Gillibrand is calling on the Federal Trade Commission (FTC) to assess the prevalence of artificial intelligence (AI)-related scams targeting older adults. Recent reports suggest that scams using AI-powered technology, including voice clones, chatbots, and “deep fake” videos, are a growing problem and may be used to target vulnerable populations, particularly older Americans.
“As artificial intelligence becomes increasingly widespread, older adults are at particular risk of becoming victims of AI-powered scams,” said Senator Gillibrand. “The FTC must take this threat seriously and provide Congress with a thorough assessment of the prevalence of these scams and its plan to fight them.”
Scammers can use AI-powered technology to create deceptive emails, phone calls, and images. Chatbots can be used to mimic a writing style, find personal information, and generate more convincing fake documents, while voice-cloning technology provides scammers with another avenue for upgraded impersonation. In one recent case, a scammer posing as a kidnapper used voice-cloning technology to duplicate the sounds of a mother’s crying daughter and demand ransom.
Senator Gillibrand is asking the FTC to answer a series of questions about the prevalence of these scams and the agency's plans to combat them.
The full text of Senator Gillibrand’s letter to FTC Chair Lina Khan can be found below:
Dear Chair Khan:
As members of the Special Committee on Aging, we write to request information on your efforts to protect older Americans from increasing threats posed by artificial intelligence (AI)-related frauds and scams. Combating frauds and scams has been a longstanding priority for the Committee across annual hearings, the Committee’s fraud hotline, and its fraud book. You recently noted how “generative AI risks turbocharging fraud.” While AI holds significant promise as an innovative technology, it can also be manipulated by malicious actors targeting vulnerable populations, particularly older Americans.
Federal Trade Commission (FTC) warnings have noted that scammers can use AI-powered technology, including voice clones and chatbots, to create deceptive emails, phone calls, and images in order to take advantage of consumers and targeted populations. Recent reports suggest that such scams are a growing problem. Voice-cloning technology in particular may facilitate imposter scams by allowing scammers to closely replicate an individual’s voice using just a short audio sample. In one case, a scammer used this approach to convince an older couple that the scammer was their grandson in desperate need of money to make bail, and the couple almost lost $9,400 before a bank official alerted them to the potential fraud. Similarly, in Arizona, a scammer posing as a kidnapper used voice-cloning technology to duplicate the sounds of a mother’s crying daughter and demand ransom.
Chatbots can also be used to mimic a writing style, find personal information, and generate more convincing fake documents, while “deep fake” videos and other AI-generated images can provide scammers with another avenue for upgraded impersonation. For older Americans, targeted by countless scams every year that result in multimillion-dollar financial losses, anxiety, and even anguish, this threat of powerful, newly enhanced fraud is acute.
As the FTC considers reasonable strategies to safeguard older Americans from frauds and scams, we request that you provide the following information by June 20, 2023:
Thank you for your attention to this important issue. We look forward to your response.
Source URL: https://oswegocountytoday.com/politics/gillibrand/gillibrand-calls-for-investigation-of-ai-related-senior-scams/
Copyright ©2026 Oswego County Today unless otherwise noted.