Lingo Telecom, a US-based voice service provider, has agreed to a $1 million settlement with the Federal Communications Commission (FCC). The company was accused of facilitating AI-generated robocalls that targeted voters ahead of the 2024 New Hampshire primary election.
On January 21, 2024, numerous New Hampshire residents received automated phone calls featuring a voice resembling that of President Joe Biden. The calls told recipients that voting in the state’s presidential primary would lead to their disenfranchisement in the November general election. The voice was a deepfake, generated with AI voice-cloning technology trained to imitate Biden.
Lingo Telecom faces penalties and compliance measures
The operation was conducted by political strategist Steve Kramer, who claimed his aim was not to influence the election but to illustrate how dangerous AI could be. Kramer’s actions nonetheless spread false information to voters and threatened the fairness of the election. Lingo Telecom’s network was used to transmit the calls to the targeted voters.
Under the settlement, Lingo Telecom will pay a $1 million civil penalty, down from the $2 million the FCC had originally sought. The company must also implement a robust compliance plan, including adherence to the FCC’s STIR/SHAKEN caller ID authentication framework.
This framework is designed to combat caller ID spoofing, the technique the robocall campaign used to make the calls appear to come from legitimate numbers.
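At a high level, STIR/SHAKEN works by having the originating carrier cryptographically sign each call's caller ID information in a PASSporT token (a JWT defined in RFC 8225, with SHAKEN extensions in RFC 8588), which downstream carriers can verify against the carrier's certificate. The sketch below builds the unsigned portion of such a token to show its structure; the certificate URL, phone numbers, and `origid` value are illustrative, and a real implementation would append an ES256 signature computed with the provider's private key.

```python
import base64
import json
import time

def b64url(data: bytes) -> str:
    # JWTs use base64url encoding with padding stripped
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def build_passport(orig_tn: str, dest_tn: str, attestation: str, cert_url: str) -> str:
    """Build the header.payload portion of a SHAKEN PASSporT.

    A real carrier would sign this string with ES256 and append the
    signature as a third dot-separated segment.
    """
    header = {
        "alg": "ES256",        # signing algorithm mandated by STIR
        "typ": "passport",
        "ppt": "shaken",       # SHAKEN extension of the base PASSporT
        "x5u": cert_url,       # where verifiers fetch the signing certificate
    }
    payload = {
        "attest": attestation,        # "A" = carrier fully attests the caller's right to the number
        "dest": {"tn": [dest_tn]},
        "iat": int(time.time()),      # issued-at timestamp
        "orig": {"tn": orig_tn},
        "origid": "d7c1b5a2-example", # hypothetical unique call identifier
    }
    return b64url(json.dumps(header).encode()) + "." + b64url(json.dumps(payload).encode())

# Hypothetical call: numbers and certificate URL are placeholders
token = build_passport("12025550100", "16035550199", "A", "https://cert.example/sp.pem")
print(token)
```

A spoofed call either carries no valid signature or only a low attestation level ("B" or "C"), which is what lets terminating carriers flag or block it.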
The FCC has stressed that the settlement sends a strong message to other companies that engage in, or facilitate, such activities. Lingo Telecom, while accepting the settlement, has strongly criticized the FCC, arguing that the regulator is trying to apply new rules retroactively.
FCC official highlights dangers of AI voice cloning in elections
Loyaan Egal, Chief of the FCC Enforcement Bureau, has warned that impersonation combining caller ID spoofing with AI voice cloning is especially dangerous during election season. Egal noted that these technologies can be misused by state actors pursuing a political agenda or by foreign actors intending to meddle in the country’s political processes.
Steve Kramer, the political consultant who made the robocalls, is facing even graver consequences. The FCC has proposed a $6 million fine against Kramer, who could also face up to seven years in prison on charges of voter suppression.
In addition, Kramer could be sentenced to a year for impersonating a political candidate. The charges reflect how seriously the authorities are treating the case, viewing it as a clear example of the risks of using AI in politics.
Robert Weissman, co-president of the non-profit organization Public Citizen, welcomed the FCC’s actions in this instance, saying that deepfakes posed an “existential threat to our democracy.”