AI audio-jacking and deep fake phone calls explained by IBM

With the explosion of artificial intelligence (AI) over the last 18 months, new cybersecurity threats are on the horizon, and some are already being used by malicious third parties. Audio-Jacking is a sophisticated form of cyberattack that targets voice communications. By using advanced AI, including Large Language Models, hackers can intercept phone conversations and alter what’s being said in real time. This could mean inserting unauthorized content into your calls, with serious implications for your financial security, personal health information, and overall trust in voice communication.


Imagine picking up your phone and having a conversation with a friend, a family member, or even your bank. Now, imagine that the voice on the other end isn’t who you think it is, but a hacker using cutting-edge artificial intelligence to impersonate them. This is the reality of a new cyber threat known as Audio-Jacking, and it’s something you need to be aware of as we move further into 2024.

The process behind an Audio-Jacking attack is intricate. Cybercriminals use malware to exploit vulnerabilities in Voice over IP (VoIP) systems and gain access to your calls. Once in, they deploy AI to understand the conversation’s context and make subtle changes to the spoken words. They use speech-to-text and text-to-speech technologies to analyze and replicate speech patterns, and deepfake technology can even mimic voices to create highly convincing audio frauds.
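That chain can be sketched with mock components. Everything below is illustrative: a real attack would slot in an actual speech-to-text engine, an LLM, and voice-cloning text-to-speech where these stubs sit, and the keyword list, account numbers, and tag format are invented for the example.

```python
# Conceptual sketch of the audio-jacking pipeline, with every AI
# component replaced by a trivial stand-in.

SENSITIVE_KEYWORDS = {"account", "routing", "transfer"}

def mock_speech_to_text(audio_chunk: str) -> str:
    # Stand-in for a real STT engine: here the "audio" is already text.
    return audio_chunk

def is_sensitive(transcript: str) -> bool:
    # Stand-in for LLM context analysis: simple keyword matching.
    words = set(transcript.lower().split())
    return bool(words & SENSITIVE_KEYWORDS)

def mock_text_to_speech(text: str) -> str:
    # Stand-in for deepfake TTS that clones the speaker's voice.
    return f"<cloned-voice>{text}</cloned-voice>"

def intercept(audio_chunk: str) -> str:
    transcript = mock_speech_to_text(audio_chunk)
    if is_sensitive(transcript):
        # The attacker substitutes altered content before forwarding it.
        altered = transcript.replace("12345678", "99999999")
        return mock_text_to_speech(altered)
    return audio_chunk  # benign audio passes through unchanged

print(intercept("please transfer to account 12345678"))
# → <cloned-voice>please transfer to account 99999999</cloned-voice>
```

The point of the sketch is the shape of the attack, not the components: only chunks the analyzer flags are rewritten and re-synthesized, so most of the call sounds untouched.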

AI Deep Faking Phone Calls

To hide the time it takes to process these changes, hackers might use social engineering tricks. They could introduce distractions or irrelevant content to cover up any delays, making the alterations to the conversation seem natural and undetectable. This can lead to severe issues, such as financial theft, exposure of sensitive health data, and even censorship.
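As a rough illustration of that masking trick, assume the attacker needs some number of milliseconds to generate the altered audio and covers the gap with canned filler; all timings and phrases below are invented for the sketch.

```python
# Toy model of latency masking: while altered audio is being generated,
# the attacker plays short filler clips so the delay feels natural.

def mask_delay(processing_ms: int, filler_ms: int = 400) -> list[str]:
    """Return the sequence of playback events covering the processing gap."""
    events = []
    remaining = processing_ms
    while remaining > 0:
        events.append("play filler ('sorry, you were breaking up')")
        remaining -= filler_ms
    events.append("play altered audio")
    return events

for event in mask_delay(900):
    print(event)
```

A 900 ms processing delay gets covered by three 400 ms filler clips before the altered audio plays, which is why a victim may hear only a slightly chatty or glitchy caller rather than an obvious pause.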

You must be aware of the risk of real-time impersonation and approach voice communications with a healthy dose of skepticism. To protect yourself, you need a comprehensive strategy. Be wary of unexpected requests or information you receive over the phone, even if it seems to come from someone you know. By rephrasing sensitive details, you can confuse AI systems that are programmed to recognize certain patterns of data. IBM explains more about the mechanics of AI audio-jacking and deepfake phone calls.
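To see why rephrasing helps, consider an interceptor keyed on a fixed phrase. The regex below is an invented example, not a real detection rule: the direct statement matches, while the paraphrase carrying the same information does not.

```python
import re

# A naive pattern an interceptor might use to spot account disclosures.
PATTERN = re.compile(r"my account number is (\d+)", re.IGNORECASE)

direct = "My account number is 12345678."
rephrased = "The last digits you have on file for me end in seven eight."

print(bool(PATTERN.search(direct)))     # True: fixed phrasing is caught
print(bool(PATTERN.search(rephrased)))  # False: the paraphrase slips past
```

Stronger LLM-based analysis may still catch paraphrases, so this is a speed bump rather than a guarantee, but it raises the cost and error rate of automated tampering.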

Overview of Audio Jacking

Audio jacking involves an attacker positioning themselves within a conversation to alter or misrepresent exchanged information. This threat vector primarily targets financial transactions but has broader implications, including healthcare, censorship, misinformation, and real-time impersonation.

Techniques and Execution:

  1. Infiltration via Malware: Attackers initiate the process by embedding malware in one participant’s device, creating a foothold for eavesdropping and manipulation.
  2. Exploitation of Communication Protocols: VoIP systems are susceptible to vulnerabilities that attackers exploit to insert themselves into conversations.
  3. Spoofing and Deepfake Technology: Advanced techniques involve spoofing caller IDs and employing deepfake audio to impersonate conversation participants convincingly.

Attack Mechanism:

  • The attack leverages an interceptor to monitor and convert spoken words into text, analyzing the content for sensitive information using natural language processing capabilities of Large Language Models (LLMs).
  • Upon detecting target information, such as bank account details, the attacker alters the audio output, substituting genuine information with fraudulent data, facilitated by text-to-speech technology and deepfake audio matching the victim’s voice.
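A toy version of that substitution step might look like the following; the digit pattern and the attacker's account number are invented for illustration.

```python
import re

# Match standalone runs of 8-12 digits, a rough stand-in for detecting
# an account number in a live transcript.
ACCOUNT_RE = re.compile(r"\b\d{8,12}\b")
ATTACKER_ACCOUNT = "4444555566"

def swap_account(transcript: str) -> str:
    """Replace any detected account number with the attacker's."""
    return ACCOUNT_RE.sub(ATTACKER_ACCOUNT, transcript)

victim_line = "Send the payment to account 1234567890, sort code 112233."
print(swap_account(victim_line))
# → Send the payment to account 4444555566, sort code 112233.
```

In the real attack described above, an LLM does this detection far more flexibly than a regex, and the substituted text is then spoken back in the victim’s cloned voice.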

Potential Impacts:
  • Financial Fraud: Directly targets financial transactions by altering banking details.
  • Healthcare Risks: Misrepresentation of health-related information could pose significant risks to patient safety.
  • Censorship and Disinformation: Altered audio can be used to spread misinformation or censor genuine speech.
  • Impersonation: Real-time voice impersonation can lead to a wide array of deceptive activities.

Mitigation Strategies:

  • Critical Skepticism: Users should approach audio communications with caution, especially when sensitive information is involved.
  • Verification Techniques: Employing paraphrasing and repetition can help verify the integrity of the information exchanged.
  • Diversified Communication: For crucial transactions, using multiple communication methods to share sensitive information can reduce risk.
  • Adherence to Security Best Practices: Regular updates, cautious interaction with emails and applications, and the use of multi-factor authentication enhance defense mechanisms against such sophisticated attacks.
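The diversified-communication point can be made concrete with a minimal sketch: a sensitive detail is trusted only when copies received over two independent channels agree. The channel values below are illustrative.

```python
def verify_out_of_band(channel_a_value: str, channel_b_value: str) -> bool:
    """Trust a detail only if copies from two independent channels match."""
    digits = lambda s: "".join(ch for ch in s if ch.isdigit())
    return digits(channel_a_value) == digits(channel_b_value)

# e.g. an account number read out on a call vs. one sent by secure message:
print(verify_out_of_band("1234-5678", "1234 5678"))  # True: digits agree
print(verify_out_of_band("1234-5678", "9999-9999"))  # False: possible tampering
```

An attacker who has compromised only the voice channel cannot make both copies agree, which is what gives the check its value.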

AI audio-jacking underscores the necessity for heightened awareness and advanced security measures in the digital age, where technological advancements continually reshape the cyber threat landscape. As cyber attackers refine their methods, the imperative for robust cybersecurity strategies becomes increasingly critical, demanding constant vigilance and adaptation from individuals and organizations alike.

When you download apps for communication, make sure you’re getting them from reliable sources. This reduces the risk of accidentally getting malware that could lead to an Audio-Jacking attack. Strengthen your defenses by using multi-factor authentication and consider moving away from traditional passwords to passkeys, which offer a more secure way to prove your identity.
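Multi-factor authentication often rests on time-based one-time passwords. As a minimal sketch, the RFC 6238 algorithm can be implemented with only Python’s standard library; the Base32 secret below is an example value, not a real credential.

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, at_time: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238, SHA-1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", at_time // step)  # 8-byte big-endian time step
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Both parties sharing the secret derive the same 6-digit code
# for the same 30-second window:
print(totp("JBSWY3DPEHPK3PXP", at_time=59))
```

Because the code changes every 30 seconds and is derived from a shared secret, a voice impersonator who cannot produce the current code fails the second factor even if the cloned voice is convincing.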

The rise of AI technology has brought new challenges to cybersecurity, like Audio-Jacking. But by staying informed and taking proactive steps to secure your communications, you can defend against these hidden dangers. Vigilance and the right protective measures are your best defense in the ever-changing world of cybersecurity.

About the Author:

Early Bird