Google is quietly testing a new way to interact with its search engine on mobile, and it could mark a significant shift in how users engage with information on the go. The new feature, called Search Live, introduces a more conversational, voice-driven experience within the Google app for iOS and Android.
Currently available only to users in the U.S. enrolled in Google’s AI Mode experiment through Google Labs, Search Live enables users to speak their queries out loud and receive results in real time. Unlike traditional voice search, this mode supports ongoing, natural dialogue, allowing users to ask follow-up questions without having to start over. It’s modeled after Gemini Live, Google’s voice-based AI assistant, but fine-tuned for web-based search interactions.
To use Search Live, users tap a new Live icon in the Google app and begin speaking. The feature is aimed at scenarios where hands-free interaction is valuable—while driving, cooking, or multitasking, for example. The experience is designed to be fluid, with voice input accompanied by visual content on screen. As the user speaks, relevant links, maps, and web results populate in real time, blending conversational ease with traditional web exploration.
Search Live reflects Google’s broader effort to reshape its core product around AI-enhanced usability. By turning search into an ongoing interaction rather than a single-point query, the company is testing how deeply users want to integrate voice into their digital routines. For now, the feature remains experimental and U.S.-only, with no confirmed rollout timeline for other regions.
While it won’t replace the standard search bar anytime soon, Search Live offers a glimpse of how Google might evolve search into a more dynamic and intuitive tool. Its effectiveness will depend heavily on user adoption and whether the conversational model feels genuinely more useful than tapping and typing.
