Meta has announced a breakthrough in integrating artificial intelligence into its products, unveiling a feature that will change how users interact with Reels videos on Instagram. With the help of Meta AI, the company's advanced artificial intelligence system, videos can be automatically dubbed between English and Spanish, with the speaker's lip movements synchronized to the new language. This change aims to overcome language barriers and extend the reach of content creators worldwide.

The announcement was made during the Meta Connect 2024 event, where several demonstrations of this technology were shown. One of the most prominent features is Meta AI's ability to translate the audio of a video into another language and synchronize it with the speaker's lips, providing a more realistic visual experience. The new feature is already being tested in the United States and Latin America in an initial phase that will allow a limited group of content creators to test this technology before its global rollout.

Translation of Reels

This new feature is the result of years of research and improvements to Meta's artificial intelligence. Using its latest AI model, Llama 3.2, the company has implemented a system that automatically translates the audio of Reels videos and adapts the creator's voice to the selected language. This technology not only changes the language but also adapts the speaker's lip movements, improving the accuracy and naturalness of the translation compared to traditional dubbing methods.
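Meta has not published technical details or an API for this feature, but the description above suggests a pipeline of speech recognition, machine translation, voice-preserving synthesis, and lip re-rendering. A minimal conceptual sketch of such a pipeline, with every function stubbed out (all names here are hypothetical, not Meta's):

```python
# Conceptual sketch of an automatic dubbing pipeline of the kind Meta
# describes. All functions are hypothetical stubs; Meta has not released
# an API for this feature.

from dataclasses import dataclass


@dataclass
class DubbedReel:
    source_lang: str
    target_lang: str
    translated_text: str
    voice_preserved: bool  # keeps the creator's vocal identity
    lip_synced: bool       # mouth movements match the new audio


def transcribe(audio: str, lang: str) -> str:
    """Stub: speech-to-text on the original audio track."""
    return audio  # pretend the input is already its own transcript


def translate(text: str, source: str, target: str) -> str:
    """Stub: machine translation of the transcript."""
    return f"[{target}] {text}"


def dub_reel(audio: str, source: str, target: str) -> DubbedReel:
    transcript = transcribe(audio, source)
    translated = translate(transcript, source, target)
    # A real system would now synthesize speech in the creator's own
    # voice and re-render the video so the lips track the new audio.
    return DubbedReel(source, target, translated,
                      voice_preserved=True, lip_synced=True)


reel = dub_reel("hola, bienvenidos a mi receta", "es", "en")
print(reel.translated_text)  # [en] hola, bienvenidos a mi receta
```

The key design point the article highlights is that the synthesis stage reuses the creator's own voice rather than substituting a generic dubbing voice, which is what distinguishes this approach from traditional dubbing.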

For example, if a creator records a video in Spanish, Meta AI can generate an English version that retains the speaker's original tone and style but uses the new language. This means the creator keeps their vocal identity, allowing for a more authentic connection with viewers in different languages. The technology also synchronizes the creator's lips with the words in the target language, greatly enhancing the visual experience.

Results

Several examples of how this technology can transform content creation on Instagram were presented at the Meta Connect event. One of the most notable demonstrations came from content creator Dana Buttigliero, known on social media as Flavors of the Conurbano. In the video, the AI translated a recipe she presented in Spanish into English, perfectly synchronizing her lips with the words in the new language. Although small errors were observed in the pronunciation of complex terms in some cases, such as "dulce de leche" being rendered as "dulce de deshi," the results were impressive overall and signaled an important advance in machine translation technology.

Another creator who participated in the demo was Ivan Acuna, known as artificially.ia, who showed how the AI could translate a video of him speaking directly to the camera. In his case, the lip-sync was remarkably smooth, showing that Meta AI has great potential to automate the translation of spoken videos with high accuracy.

Currently, the Reels translation and synchronization feature is in a limited testing phase. In this first version, only a select group of content creators in the US and Latin America have access to the technology. While Meta has not yet announced a specific date for the global launch, the company's CEO, Mark Zuckerberg, has said that the feature's availability will be expanded to more countries and languages in the coming months.