Chinese company New Senyang’s Q11 wireless earbuds promise live translation of a whopping 156 languages "in seconds" via artificial intelligence, for around the same price as two cups of coffee.

Purchased from Temu for only $11.60, the headphones are uncomfortable to wear and don’t have great sound quality.

But surprisingly enough, the translation provided by their accompanying software actually works — some of the time, at least.

The Q11 earbuds are also 97 per cent cheaper than Apple’s latest AirPods Pro 3, which the tech giant introduced in September for $429.

Apple’s latest in-ear headphones and software updates heralded the arrival of a beta version of the company’s on-device live translation feature, enabled by AI software which its marketing team calls Apple Intelligence.

However, the feature still only supports a measly 11 languages (counting regional variants separately) at the time of publication: Chinese (both simplified and traditional Mandarin), English (both UK and US), French, German, Italian, Japanese, Korean, Portuguese, and Spanish.

While free and powerful translation apps have existed for decades, improvements in machine learning and generative AI have made translating between languages both quicker and more accurate — so adding live AI-based translation to headphones has been an obvious market move.

But processing translations quickly, accurately, and privately comes at a cost — more than $1,000, in Apple’s case.

Comparing Apples and oranges

To use Apple’s live translation feature, you’ll need a pair of AirPods Pro 3, AirPods Pro 2, or the slightly cheaper AirPods 4 with active noise cancellation — the latter of which costs $299.

You’ll also need an iPhone with Apple Intelligence, which means an iPhone 15 Pro or later.

The cheapest Apple Intelligence-enabled iPhone which Apple currently sells is the $999 iPhone 16e.

So, at minimum, you’ll need to have spent around $1,300 to access Apple’s live translation feature, which remains in beta.


New Senyang's Q11 headphones (left) cost around $12, while Apple's AirPods Pro 3 (right) cost $429. Image: Tom Williams / Information Age

You’ll also need to download languages individually onto your iPhone, and they can take up around 1GB of storage each.

This allows translations to run locally on your device, instead of being sent through the internet to a mystery data centre for processing.

Other Big Tech manufacturers such as Google and Samsung have similar AI-based live translation features on their own devices, which can also be used offline if languages are pre-downloaded.

Because live translations run solely through an on-device AI model, Apple’s translations are processed entirely on the user's iPhone, which the company says “ensures your conversations stay private”.

On the other hand, the New Senyang Q11 earbuds rely on a free iOS and Android app called TransKit, which requires an internet connection and sends data to a server for processing.

According to TransKit's user agreement and privacy policy, the company uses my personal data to provide “services to you, diagnosing application problems, providing customer care and support services, and improving the application”.

Information about my mobile device, its network, and how I use the app is also stored for as long as I use the application, according to the policies.

Users can withdraw their consent for this data collection, but the email address to contact the “data protection officer” is strangely just a standard Gmail account.

The policies also state that my contacts, photos, and text messages are not collected by the app — which is a relief, but also feels like a weird thing to explicitly mention.

I had assumed the app wouldn’t be taking that kind of private information anyway, but now I’m a little less sure.


Look familiar? The New Senyang Q11 and Apple AirPods Pro 3 have a similar form factor when closed. Image: Tom Williams / Information Age

Nothing’s perfect in the land of live translation

TransKit’s off-device text translations were almost instant in my testing, but not always reliable.

The app was also slower than Apple’s at generating live-translated audio in the Q11 earbuds when I conversed with French and German speakers, often taking almost 30 seconds for longer phrases.

Words were also frequently missed, translations would hang and get stuck, and the audio often jumped around erratically, skipping whole parts of the conversation.

This is likely because the Q11s rely on an internet connection and external processing, and don’t have great microphones.

Apple’s live translations aren’t perfect either.

The company even admits in one disclaimer: “Translate should not be relied on in circumstances where you could be harmed or injured, in high-risk situations, for navigation, or for the diagnosis or treatment of any medical condition."

But translations through Apple Intelligence are still faster, more consistently accurate, and less likely to get stuck or jump around than those from TransKit and the cheap Temu earbuds — which isn't a great surprise given the price difference.

Initiated by long-pressing on both AirPods simultaneously, Apple’s live translation feature sits in its Translate app — which does have some drawbacks.

You can’t save a transcript of a conversation or copy text from it like you can on TransKit and many other translation apps.


Apple integrates live translations into its Translate app, as well as in Messages, FaceTime, and the Phone app. Images: Apple

However, Apple has integrated live translations in similarly clever ways to both Samsung and Google.

Users can get live translations in the Messages app (translating text to text), during phone calls (translating into both live audio and text), and during FaceTime calls (with live captions as people speak).

Noise cancellation on AirPods also lowers the volume of the person you're speaking with, to make it easier to hear the translation of what they’re saying.

Where Apple falls behind is the number of languages it supports; Google's live AI translations can handle more than 70 languages, and Samsung's over 25.

While live translation is obviously not the AirPods’ only useful feature — and not the main reason people buy them — it’s a nice-to-have capability which will no doubt improve as Apple adds more languages, improves its AI models, and brings the feature out of beta.

Until then, it still feels like the company is playing a bit of catch-up to Samsung and Google when it comes to integrating useful generative AI features.

Apple provided Information Age with AirPods Pro 3 and loaned an iPhone Air for testing.