Meta, the company behind Facebook, Instagram, Messenger, and other platforms, has announced significant enhancements to its AI-powered assistant, Meta AI. Starting today, users gain access to new features such as support for more languages and the ability to create stylized selfies. Users can also direct their questions to the company's most advanced AI model, Llama 3.1 405B, which handles more complex queries than its predecessor.
The initial launch of Meta AI fell short of expectations, according to several critics who found that the assistant struggled with basic tasks such as searching for recipes or airfare. With Llama 3.1 405B, Meta aims to improve accuracy in math and programming, making the assistant better at solving math problems, explaining scientific concepts, and debugging code. The model comes with limitations, however: users must activate it manually and are allowed only a set number of queries per week, after which Meta AI automatically switches to a less capable model, Llama 3.1 70B. EU users also lack access to the new technology, as Meta decided not to launch it there, citing the "unpredictable nature of the European regulatory framework."
Generative Selfies and Editing Tools
In addition to Llama 3.1 405B, Meta has introduced a new generative AI model called "Imagine Yourself," which creates stylized images of a user based on a photo and a prompt such as "Imagine me surfing" or "Imagine me on a beach vacation." The feature, currently in beta, is activated by typing "Imagine me" followed by a description of the desired scene; prompts for inappropriate content are blocked. Meta has not specified what data was used to train "Imagine Yourself," but its terms of use indicate that public posts and images on its platforms may be used for this purpose, a policy that has raised concerns among users, particularly because of the complicated opt-out process.
Alongside "Imagine Yourself," Meta AI also includes new editing tools that allow users to add, remove, or change objects in images with commands like "Change the cat to a corgi." Starting next month, an "Edit with AI" button will be available, offering additional fine-tuning options.
New Languages and Quest
Meta AI is also set to replace the voice command feature of the Meta Quest VR headset next month in the United States and Canada, initially as an experimental feature. Users will be able to ask Meta AI questions about their physical surroundings, for example, "Look and tell me what kind of top would complete this outfit" while holding up a pair of shorts.
Meta AI is now available in 22 countries, including new markets such as Argentina, Chile, Colombia, Ecuador, Mexico, Peru, and Cameroon. The assistant has also expanded its language support to French, German, Hindi, Hindi (Romanized script), Italian, Portuguese, and Spanish, with more languages promised soon. Together, these updates represent Meta's effort to broaden the functionality and accessibility of its AI assistant, address early criticism, and deliver a more robust and useful experience for users worldwide.