Google is set to enhance its AI assistant, Gemini Live, with several new features. Next week, it will introduce a capability that lets the assistant highlight items on your screen while you're sharing your camera. For instance, if you're searching for a specific tool, just point your phone's camera at your toolbox, and Gemini Live will highlight the right one. The feature will debut with the Pixel 10 on August 28 and then roll out to other Android devices, with iOS support coming soon after.
Gemini Live is also getting new integrations with apps like Messages and Phone. Imagine you're talking to Gemini about directions and realize you're running late: you can tell Gemini to message your friend, and it will draft the text for you.
Google is also upgrading the audio model behind Gemini Live. The change will make the assistant sound more human, adjusting its tone to match the conversation; if you're discussing something stressful, for example, it might use a softer voice. You'll also be able to change how fast or slow Gemini speaks, similar to the controls in ChatGPT's voice settings. And if you ask for a story in a dramatic style, it could even adopt an accent to make the experience more engaging.
AI assistants are increasingly becoming part of daily life. A survey by Voicebot.ai found that about 52% of people use voice assistants daily, a sign of growing reliance on these technologies, and advancements like these could make interacting with them even more natural.
With these updates, Google is aiming to make Gemini Live a more helpful assistant and keep it relevant in a fast-evolving tech landscape.
For further details, check out Google’s official announcement on Gemini Live updates.