Meta Launches Standalone AI App with Voice Conversations and Social Discovery Feed

Meta has introduced a standalone app for its artificial intelligence (AI) chatbot, Meta AI, now available for download on both Android and iOS devices. The new app pairs AI-powered interactions with a social layer. One of its standout features is the social Discover feed, where users can explore posts and images shared by others. The app also introduces a voice mode that enables two-way verbal conversations with the chatbot, though this feature is currently available only in select countries.

In an official newsroom post, Meta unveiled the new AI app and highlighted its key features. This launch comes after CEO Mark Zuckerberg revealed the company’s plans to develop a dedicated AI app. Powered by the Llama 4 AI model, the Meta AI app enters a competitive market alongside other AI platforms like ChatGPT, Gemini, Grok, and Claude. The app is designed to provide more than just text-based interactions, offering users a comprehensive experience that integrates social media-like features.

The social aspect of the app is evident through the Discover feed, where users can share their interactions with Meta AI, including prompts, responses, and even images generated by the AI. Users are encouraged to engage with the content, liking, commenting, or remixing posts to create new prompts. Importantly, Meta ensures user privacy, stating that no content is shared to the feed unless the user chooses to post it. This emphasis on user control allows for a more personalized and secure experience.

To access the Meta AI app, users must sign in using a Meta account, which can be linked to either their Instagram or Facebook profile. Once logged in, the app can access information like the user’s profile, content they’ve engaged with, and past conversations with Meta AI. This integration with Meta’s broader ecosystem aims to create a seamless and connected experience across various platforms.

Meta Reportedly Testing In-House AI Training Chipsets for the First Time

Meta has reportedly started testing its first in-house chipsets designed for training artificial intelligence (AI) models. These processors, developed under the Meta Training and Inference Accelerator (MTIA) program, mark a significant step in the company’s effort to reduce reliance on third-party chip suppliers. A limited number of these custom chips have been deployed for initial testing to evaluate their performance and efficiency. If the tests yield positive results, Meta is expected to scale up production and integrate these chipsets into its AI infrastructure.

According to a Reuters report, Meta has collaborated with Taiwan Semiconductor Manufacturing Company (TSMC) to develop these AI-focused processors. The company has reportedly completed the tape-out stage—one of the final steps in chip design—indicating that the project is moving closer to full-scale deployment. While testing is still in its early stages, Meta’s move highlights its commitment to developing proprietary AI hardware, potentially giving it more control over performance optimization and cost management.

This is not Meta’s first venture into AI chip development. The company previously introduced custom accelerators built specifically for AI inference tasks. However, until now, Meta lacked in-house chipsets dedicated to training large-scale AI models such as its Llama family of large language models (LLMs). With these new processors, the company aims to enhance its AI capabilities while reducing dependence on external chip manufacturers like Nvidia and AMD.

If Meta successfully scales up production of its custom AI chipsets, it could lead to more efficient AI training, improved model performance, and lower operational costs. The move aligns with a broader industry trend where major tech firms, including Google and Amazon, are investing in custom AI chips to stay competitive in the rapidly evolving AI landscape. As Meta continues its AI hardware push, further details about its chip performance and deployment strategy are expected to emerge in the coming months.

WhatsApp Beta Unveils AI-Generated Group Icons and Meta AI Widget

WhatsApp is rolling out two new features for beta testers on Android, aimed at enhancing the user experience and integrating AI into its messaging platform. These updates are part of the company’s ongoing effort to make the app more intuitive and accessible. The first is an AI-driven group icon generator, which lets users create unique group images using Meta AI. Users can personalize their group chats with an icon that reflects the group’s theme or mood, offering a quick way to refresh group visuals without manually searching for images.

In the latest WhatsApp beta version (2.25.6.10), some testers have access to a new option that appears when they tap on the pencil icon on a group’s existing image. The new “Create AI Image” option lets users generate entirely new images using Meta’s AI technology, offering a quick and easy way to create custom icons based on specific prompts. This feature is still in the testing phase, so not all users will have access yet, but it’s expected to be rolled out to a wider audience in future updates.

Another significant update that WhatsApp beta users are experiencing is the introduction of a Meta AI widget. This new feature, available in the 2.25.6.14 beta update, allows users to add a shortcut to their home screen, providing direct access to the Meta AI chatbot. With this widget, WhatsApp users can interact with Meta AI without needing to open the app first, making the process much more convenient. It’s designed to streamline how users engage with the chatbot, helping them quickly get answers or support.

Both features are part of WhatsApp’s broader strategy to integrate more AI-powered tools into its platform, enhancing user experience while keeping the app fresh and engaging. While these features are currently limited to beta testers, they are expected to roll out to all users on both Android and iOS in the near future. This integration of AI tools could significantly change how users interact with their WhatsApp groups and use the app in general, making it an even more powerful communication tool.