Passing off a course in Aravaca as a master’s degree at Harvard is no longer the most disconcerting thing human resources technicians can face. Now these professionals must pay closer attention than ever to the body language of their interviewees on video calls, not to glean aspects of their personality, but simply to find out whether they are real people.
According to a recent FBI report, the US investigative agency is receiving a growing number of complaints from companies claiming that one or more candidates in their selection processes used deepfake technology to impersonate someone else during interviews and thereby improve their chances of getting the job.
Why? According to the FBI, the purpose of these actions has little to do with the workplace itself. The positions applied for are usually related to ICT, databases or programming, roles that grant access to personal information about the company’s clients and employees, or to the organization’s financial data. The goal of these impostors is therefore nothing other than to steal that information and use it for illicit purposes.
The agency notes in its report that it has recently received a multitude of complaints of this type, especially from technology companies, although it has not specified how many of these impersonation attempts were detected in time and how many made it through every phase of the selection process and gained access to confidential company data.
What makes it possible? Evidently, for a fraud of this nature to work, the targeted positions must be fully remote, with the entire selection process conducted remotely through emails, calls and videoconferences. That, added to the fact that ICT professionals usually have broader access to their organizations’ sensitive data, explains why so many of the impersonation attempts have targeted technology jobs.
How to detect them? The FBI points out that some of the companies that spotted the impersonation during the selection process detected the deception because the candidate’s voice and image did not match at certain moments: for example, the candidate coughed on the audio but the image did not move, not even with a delay. Other companies discovered the hoax thanks to deepfake detection software.
Even though some companies managed to catch them in time, recognizing that you are facing a deepfake is not easy if it is well executed and you are not alert to the possibility. Poor-quality, slow, laggy or choppy video calls are quite common, so the telltale glitches of a spoof can easily be mistaken for ordinary connection problems and overlooked.
Specialized deepfake detection software, for its part, is still far from fully reliable. According to a study by Carnegie Mellon University in the United States, the effectiveness of the tools currently available ranges from 30% to 97%.
Image | Wocintechchat