
Android 16 Beta 1 Released for Google Pixel Phones: Key Features and Supported Models

Google has officially rolled out Android 16 Beta 1, giving developers and early adopters a chance to explore the latest features of its upcoming major operating system update. The beta release is currently limited to recent Google Pixel devices, letting users get a first look at the new functionality and improvements. Unlike in previous years, Android 16 is expected to follow an earlier release timeline, with a full rollout anticipated in Q2 2025. This shift marks a departure from Google’s traditional Q3 or Q4 release schedule, signaling a faster development cycle for the new OS.

One of the standout features introduced in Android 16 Beta 1 is Live Updates, designed to enhance real-time tracking of ongoing activities. The feature allows apps to display dynamic status updates, such as the progress of a food delivery or the current state of navigation in Google Maps. These updates appear in the notification panel with customizable icons, giving users more visual clarity and convenience. The concept may seem familiar to iPhone users: Apple has offered a similar feature, Live Activities, since iOS 16.1, so Live Updates brings Android up to par with this dynamic notification experience.

In addition to Live Updates, Android 16 Beta 1 introduces several under-the-hood improvements focused on performance, security, and user interface enhancements. Google has refined system animations for smoother transitions and introduced new privacy controls that give users greater transparency over how apps access sensitive data. Developers will also find updated APIs designed to improve app compatibility and performance optimization, ensuring that third-party applications can leverage Android 16’s new features effectively.

As with any beta software, Android 16 Beta 1 is primarily intended for testing and feedback purposes, meaning it may contain bugs or performance issues. Google encourages developers and enthusiasts to report any problems to help improve the final release. The beta is compatible with a range of recent Pixel models, including the Pixel 6 series, Pixel 7 series, and newer devices. Users interested in trying out Android 16 Beta 1 can enroll their devices through the official Android Beta Program, with the final stable version expected to be widely available later this year.

Google Said to Be Developing Gesture-Based Gemini Live Shortcut for Android Devices

Google is reportedly working on a new way to activate Gemini Live, an advanced two-way voice conversation feature on Android devices. Currently, users can only access this feature within the Gemini app, but Google is aiming to make it more accessible by integrating a gesture-based activation method. This change could significantly improve the user experience by providing a faster and more convenient way to initiate voice interactions with the AI, potentially boosting adoption rates for this feature.

Gemini Live’s New Gesture Shortcut

A recent report from Android Authority suggests that Google is preparing to introduce the new Gemini Live activation method in an upcoming update. The feature was discovered in the Google app version 16.2.39 during an APK teardown, although it is not yet functional in the current version of the app. This indicates that the feature is still in development and users will have to wait until it officially rolls out. Once available, the gesture-based shortcut could offer a quick, hands-free way to start a voice conversation with Gemini Live, making it more seamless for users.

Existing Activation Methods for Android Users

Currently, Android users can activate Google Assistant in several different ways. The most common is saying “Hey Google” or “OK Google”, though this isn’t always convenient for users who would rather not use voice commands. Alternatively, users can long-press the power button or tap the microphone icon in the Google Search widget. Another option is swiping diagonally from the bottom corners of the screen, or using a home screen shortcut icon. With the introduction of Gemini Live’s gesture shortcut, users could have even more ways to interact with the AI, further enhancing its accessibility.

Potential Impact of Gesture Activation

By introducing a gesture-based activation system, Google is likely aiming to make Gemini Live even more integrated into the user’s daily routine, without needing to open a specific app. This move could make it more competitive against other voice assistant technologies, as users look for more efficient and innovative ways to interact with their devices. If successfully implemented, this new shortcut could improve how Android users engage with their devices, offering a faster, more intuitive AI experience.

Google Lens Update Brings Instant Camera Viewfinder Access: Report

Google Lens has reportedly introduced a new update that streamlines its interface, offering a more direct approach to using its camera functionality. Previously, users needed to go through a two-step process to access the camera viewfinder after launching the app. This involved opening a preview screen and then tapping the area with the viewfinder to activate the camera. The latest update eliminates this additional step by automatically launching the camera viewfinder upon opening the app. This change reflects Google’s efforts to enhance user experience by simplifying interactions within its applications.

The update comes shortly after reports surfaced about Google’s ongoing efforts to redesign several of its apps, including a recent overhaul of its “Circle to Search” feature for better usability. According to a report by 9to5Google, the improved functionality was observed in Google Lens version 16.0.7 on Android and the latest iteration on iOS. However, the update appears to be in its early rollout phase, as it has yet to be widely available. For instance, Gadgets 360 staff members noted that the feature was not active on their devices at the time of testing.

Currently, the Google Lens app opens to a full-screen split interface. The upper third of the screen displays a preview viewfinder, showing the live camera feed through a translucent overlay, while the bottom two-thirds provides access to the local gallery, showcasing the six most recent images on the device. This layout is designed to give users quick access to both real-time camera-based searches and previously captured images.

By streamlining access to the camera, Google Lens aims to make real-time image recognition faster and more intuitive for its users. This update is likely to appeal to individuals who rely on the app for tasks like text translation, object identification, and barcode scanning. While the feature is still in its limited release phase, its broader rollout is expected to improve accessibility and usability across Android and iOS devices. As Google continues to refine its suite of tools, such updates highlight the company’s commitment to optimizing user interfaces for seamless functionality.