Unbabel for Video launches to translate your videos in seconds

You don’t need to launch your product or service outside the U.S. for translation services to become important.

In the U.S., Hispanic viewers are ahead of the curve when it comes to digital. They lead in adoption of new devices. The average Hispanic person in the U.S. spends more than eight hours watching online video each month. That’s more than 90 minutes longer than the U.S. average, according to a Nielsen report.

But how do you translate your lucrative, high-converting video content into Spanish, or any other language, for that matter?

Today, Unbabel — the Y Combinator-backed startup that combines machine learning with crowdsourced human translation — announced the launch of Unbabel for Video to solve exactly this problem.


“We’ve seen the results of translating videos through the experiences of YouTube channels such as VICE,” Vasco Pedro, CEO and cofounder of Unbabel, told me in an interview at CapitalFest in Amsterdam. “After translating videos into Spanish and Portuguese, it more than doubled the daily watch time from speakers of those languages. In fact, watch time from [Spanish-speaking] users tripled.”

Here’s how it works.

Powered by its proprietary technology platform, which combines neural machine translation, artificial intelligence, quality estimation algorithms, and a global network of more than 50,000 bilingual post-editors, the new service transcribes and translates video and audio content, returning searchable, time-stamped text in dozens of languages.

In short, Unbabel for Video first produces a high-quality automatic translation on the company's platform. That output is then distributed to the Unbabel community, whose members use the system to review and edit thousands of words per day.
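The machine-plus-human pipeline described above can be sketched as follows. This is a hypothetical illustration of the general approach — machine translation scored by quality estimation, with low-confidence output routed to human post-editors — and every function name, threshold, and translation here is invented for the example, not Unbabel's actual API.

```python
def machine_translate(segment: str, target_lang: str) -> tuple[str, float]:
    """Stand-in for neural MT that also returns a quality-estimation
    score between 0.0 (poor) and 1.0 (publishable). The lookup table
    below is dummy data for illustration only."""
    translations = {
        ("Hello, world", "es"): ("Hola, mundo", 0.95),
        ("See you tomorrow", "es"): ("Hasta manana", 0.60),
    }
    return translations.get((segment, target_lang), (segment, 0.0))


def translate_with_review(segments, target_lang, qe_threshold=0.9):
    """Route low-confidence machine output to a human review queue,
    mirroring the MT-then-post-editor flow described in the article."""
    published, review_queue = [], []
    for seg in segments:
        translation, score = machine_translate(seg, target_lang)
        if score >= qe_threshold:
            published.append(translation)     # confident: ship as-is
        else:
            review_queue.append(translation)  # send to bilingual post-editors
    return published, review_queue
```

In a setup like this, the quality-estimation threshold controls the trade-off between speed (publish machine output immediately) and cost (pay human editors to review it).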

How has Unbabel managed to build such a big post-editor community?

“When we launch in a different country we create and promote content that attracts a seed group of 10 to 20 people,” Pedro said. “From there, we have a program that allows those first post-editors to refer their friends to the program. Of course, we then put these applicants through a series of tests and certifications before accepting them as translators.”

This community of human translators, and the speed with which the system puts them to work, make Unbabel an attractive option for fast, accurate translation. The new API transcribes and translates video automatically.

Pedro showed me the API working within a video app that the company has prepared for demonstration purposes. We recorded a short video that the system transcribed within a few seconds, creating subtitles for the video in English. Pedro then showed me instant translations in Spanish, Portuguese, Chinese, and Japanese. The transcription wasn’t perfect, which means the translations weren’t either, but with the click of a button, the video was pushed out to the post-editor community.

Within a few minutes, we had perfect subtitles in five languages. The API supports 28.
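The time-stamped text the service returns maps naturally onto the standard SubRip (SRT) subtitle format. As a rough sketch of that last step — with invented segment data, since the article doesn't specify the API's output schema — a conversion might look like this:

```python
def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as the HH:MM:SS,mmm stamp SRT expects."""
    ms = round(seconds * 1000)
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"


def to_srt(segments) -> str:
    """Build an SRT document from (start_sec, end_sec, text) tuples:
    a numbered block with a timestamp range per subtitle."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(blocks)
```

For example, `to_srt([(0.0, 2.5, "Hola, mundo")])` yields a block beginning `1`, then `00:00:00,000 --> 00:00:02,500`, then the subtitle text.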

It is important to note that Unbabel for Video is a B2B solution. The company has no immediate plans to make this technology available to consumers.

“You need a lot of money to go after consumers,” Pedro said. “And, frankly, we don’t have expertise in consumer-facing products. Consumers are also less willing to pay for transcription and translation services, whereas businesses can use our product to dramatically reduce the costs and resources required to translate business content, including video.”

And that’s the key to this system. In essence, Unbabel is providing the kind of translation service you’d expect from a high-end agency, but at a lower cost and with a much faster turnaround.

Unbabel for Video is available starting today.