The FCC’s war on robocalls has gained a new weapon with the declaration that AI-generated voices are “artificial” and therefore squarely against the law when used in robocall scams. It may not stop the flood of fake Joe Bidens that are sure to clog our phones this election season, but it can’t hurt.
The new rule, under consideration for months and telegraphed last week, is not really a new rule: the FCC can’t simply invent rules without due process. Instead, it is a clarification that AI-voiced robocalls fall under something already largely prohibited by telephone consumer protection law: artificial, pre-recorded messages sent willy-nilly to every number in the phone book (something that still existed when the law was written).
The question was whether an AI-cloned voice speaking a script falls into those prohibited categories. That may seem obvious to you, but nothing is obvious to the federal government by design (and sometimes for other reasons), so the FCC had to look into the matter and solicit expert opinion on whether AI-generated voice calls should be banned.
The decision was likely spurred by the high-profile (but stupid) case last week of a fake President Biden calling New Hampshire residents and telling them not to waste their primary vote. The shady operations behind it are being made examples of, with attorneys general and the FCC, and perhaps more authorities to come, more or less pillorying them as a deterrent to others.
As we wrote, the call would not have been legal even if it had used a Biden impersonator or a cleverly manipulated recording. It was still an illegal robocall and likely a form of voter suppression (though no charges have been filed yet), so there was no trouble fitting it into the existing definitions of illegality.
But these cases, whether brought by states or federal agencies, must be supported by evidence to go to trial. Before today, using an AI voice clone of the president may have been illegal in some respects, but not specifically in the context of robocalls: an AI voice clone of your doctor reminding you that your appointment is coming up wouldn’t have been a problem, for example. (Importantly, you would likely have opted in to that one.) After today, however, the fact that the voice in a call is an AI-generated fake counts as a point against the accused in legal proceedings.
Here’s an excerpt from the declaratory ruling:
Our findings will deter negative uses of AI and ensure that consumers are fully protected by the TCPA when they receive such calls. And it also makes clear that the TCPA allows no exclusion of technologies that purport to provide the equivalent of a live agent, thereby preventing unscrupulous companies from attempting to exploit any perceived ambiguity in our TCPA rules. Although voice cloning and other uses of AI in calls are still evolving, we have already seen their use in ways that can be particularly harmful to consumers and those whose voices are cloned. Voice cloning can convince a called person that someone they trust, or someone they care about, such as a family member, wants or needs them to take actions they would not otherwise take. Requiring consent for such calls gives consumers the right not to receive such calls or, if they do, the knowledge that they should be careful about them.
It’s an interesting lesson in how legal concepts are sometimes built to be flexible and easily adapted: while there is a process involved and the FCC cannot arbitrarily change definitions (there are barriers to that), once the need is clear, there is no requirement to consult Congress, the President, or anyone else. As the expert agency on these matters, the FCC is empowered to research and make these decisions.
Incidentally, this extremely important capacity is threatened by a looming Supreme Court decision that, if it goes the way some fear, would overturn decades of precedent and paralyze America’s regulatory agencies. Great news if you love robocalls and polluted rivers!
If you receive one of these AI-powered robocalls, try recording it and reporting it to your state attorney general’s office. They’re probably part of the recently formed anti-robocall coalition coordinating the fight against these scammers.