- Google launched a new feature called “AI Mode” in its search engine, a major step forward in how users interact with information on the internet.
- The release was unveiled at the Google I/O 2025 conference, where the company’s forward-looking innovations took center stage.
- AI Mode is intended to enhance search through advanced artificial intelligence, delivering more natural and comprehensive results.
Introducing Google AI Mode
AI Mode is Google’s effort to move beyond conventional keyword-driven search. With its new Gemini 2.5 model, Google lets users perform conversational searches with sophisticated, multi-part questions and follow-up questions. This transition turns the search engine into an interactive, personalized assistant.
New features such as “Deep Search” and “Search Live” let users obtain rich topic overviews, navigate the web through automated tools, and use real-time visual search. Together, these features transform the search process, making it more dynamic and user-driven.
Improving User Experience through Personalization
AI Mode is designed to deliver customized search results based on users’ historical interactions and preferences. For example, it can tailor responses to prior searches or incorporate information from other Google services such as Gmail, if users choose to share that data. This personalization is intended to surface more context-sensitive and relevant information.

AI Mode also adds rich visuals, including detailed product and local business cards, to improve how users discover and engage with information. These visualizations are powered by Google’s Shopping Graph, which keeps the data accurate and fresh.
The Future of Search
Google is also integrating its newest AI model, Gemini 2.5, into its search algorithms and is set to begin testing additional AI-based features, including automatically purchasing concert tickets and searching within live video streams.

In another example of Google’s do-it-all strategy for AI, the company announced that it plans to use the technology to return to the smart glasses market with a new pair of Android XR-powered glasses. The preview of the upcoming device, which features a hands-free camera and a voice-activated AI assistant, comes 13 years after the release of “Google Glass,” a product the company canceled amid a public outcry over privacy issues.
The company didn’t say when its Android XR glasses will go on sale or how much they will cost, but revealed they will be co-designed with Gentle Monster and Warby Parker. The glasses will compete with a similar product already on the market from Facebook parent Meta Platforms and Ray-Ban.
The expansion builds on a shift Google initiated last year with the launch of conversational summaries known as “AI Overviews,” which have been consistently appearing at the top of its results page, displacing the familiar rankings of web links.
Around 1.5 billion people regularly interact with AI Overviews today, Google says, and most users are now typing longer and more sophisticated queries.
“What all of this progress adds up to is that we’re in a new era of the AI platform transition, where all the work over decades is finally coming to people everywhere,” Google CEO Sundar Pichai told a crowded audience in an amphitheater near the company’s headquarters in Mountain View, California.
Google is also launching a VIP ticket to all its AI technology: an “Ultra” subscription plan that will cost $250 a month and include 30 terabytes of storage. That is a big leap past Google’s current top-of-the-line package, which is being rechristened “AI Pro” and costs $20 a month with two terabytes of storage.