Meta AI is designed to understand your needs, so its answers are more helpful. It’s easy to talk to, so interactions feel more fluid and natural. It’s more social, so it can show you the people and places you care about. You can also use Meta AI’s voice capabilities while multitasking on your device, and a visible icon reminds you when the microphone is in use.
Hey, Meta, let’s chat

While talking to AI is nothing new, we’ve improved the underlying Llama 4 model to give you more personalized, relevant, and friendly responses. The app also integrates other Meta AI features, such as image generation and editing, and you can now do all of this with your AI assistant through voice or text conversations.
We’ve also included a voice demo built on full-duplex voice technology, which you can toggle on or off to test. It delivers a more natural voice experience: the model is trained on conversational dialogue, so the AI generates speech directly instead of reading a written response aloud. The demo can’t access web pages or real-time information, but we hope it gives you a first-hand glimpse of where this technology is headed. You may run into technical issues or inconsistencies, and we’re collecting feedback to keep improving the experience. Voice conversations, including the full-duplex demo, are available now in the US, Canada, Australia, and New Zealand. For more on managing your experience in the Meta AI app and switching modes, visit our Help Center.
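To make the full-duplex idea concrete, here is a minimal, purely illustrative sketch of the difference between a classic half-duplex voice pipeline (speech to text, text reply, text to speech) and a full-duplex one, where listening and speaking happen concurrently. All function names are hypothetical stand-ins; this is not Meta’s implementation.

```python
import asyncio

async def mic_frames():
    """Stub microphone: yields short audio chunks (stand-ins for real frames)."""
    for i in range(5):
        await asyncio.sleep(0.1)  # pretend to capture ~100 ms of audio
        yield f"<audio frame {i}>"

async def half_duplex(turn):
    # Classic pipeline: wait for the user to finish, transcribe the whole
    # turn, generate a written reply, then read it aloud with TTS.
    await asyncio.sleep(0.1)
    text = f"transcript({turn})"   # stand-in for speech recognition
    reply = f"llm_reply({text})"   # stand-in for text generation
    print("TTS speaks:", reply)    # stand-in for speech synthesis

async def full_duplex():
    # Full-duplex sketch: audio in and audio out are concurrent streams.
    # The model can begin emitting speech while still hearing the user,
    # with no intermediate written response to read from.
    async def listen():
        async for frame in mic_frames():
            print("model hears:", frame)

    async def speak():
        for i in range(5):
            await asyncio.sleep(0.12)
            print("model speaks: <audio chunk>", i)

    await asyncio.gather(listen(), speak())  # both run at the same time

asyncio.run(half_duplex("<user audio turn>"))
asyncio.run(full_duplex())
```

Running it, the half-duplex pipeline prints a single spoken reply after the turn ends, while the full-duplex version interleaves “hears” and “speaks” lines, which is the behavior the demo is built around.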
Meta AI uses Llama 4 to help you solve problems, answer everyday questions, and better understand the world around you. It features web search to help you get recommendations, dig deeper into a topic, and stay connected with friends and family. If you just want to try it out, we’ve got some conversation starters to spark your search.
We’ve spent decades personalizing the user experience on our platforms, and we’ve applied that philosophy to Meta AI to make it even more personal. You can ask Meta AI to remember certain things about you (like that you enjoy traveling and learning new languages), and it can also pick up important details from context. Your Meta AI assistant draws on information you share across Meta products (like your profile and the content you like or interact with) to give more relevant answers to your questions. Personalized replies are available now in the US and Canada. If you’ve added your Facebook and Instagram accounts to the same Accounts Center, Meta AI can use information from both to deliver an even stronger, more personalized experience.

Existing Meta View users can continue to manage their AI glasses through the Meta AI app: all paired devices, settings, and media will automatically move to the new Devices tab once the app is updated.

From AI glasses to desktop

The web version of Meta AI has also been upgraded. It features voice interaction and the new Discover feed, just like the app. This continuity across the Meta AI app, AI glasses, and the web helps deliver more personalized AI, so you can get what you need, when and where you want it.
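The “remember certain things about you” behavior described above can be pictured as a small memory store whose facts resurface when a new question overlaps with them. The sketch below is a deliberately naive toy, not Meta AI’s actual memory system, which is not public; every name in it is illustrative.

```python
# Toy memory store: facts the user explicitly asks the assistant to keep
# are saved, then surfaced when a new question shares words with them.
memories: list[str] = []

def remember(fact: str) -> None:
    """Store a fact the user asked the assistant to remember."""
    memories.append(fact.lower())

def relevant_memories(question: str) -> list[str]:
    """Naive context match: return facts sharing at least one word with the question."""
    words = set(question.lower().split())
    return [m for m in memories if words & set(m.split())]

remember("I like traveling and learning new languages")
print(relevant_memories("Any tips for traveling to Lisbon?"))
# -> ['i like traveling and learning new languages']
```

A production system would use semantic retrieval rather than word overlap, but the shape is the same: store what the user shares, then pull it back in when context makes it relevant.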
You control your experience

Voice is the most intuitive way to interact with Meta AI, and the Meta AI app is designed so you can start a conversation at the touch of a button, even when you’re multitasking or busy. If you want voice on by default, you can turn on the “Ready to talk” feature in settings.