Google’s Gemini chatbot can now retain information about your preferences, lifestyle, and work, enabling a more personalized conversational experience.
Recently highlighted by users on social media and confirmed by Google, this memory feature is gradually rolling out to Gemini users. Much like ChatGPT’s memory, it allows Gemini to recall details from past interactions to inform future conversations. For instance, if you tell Gemini your favorite foods, it can draw on that information to provide tailored restaurant recommendations in later queries.
Currently, the feature is accessible only to subscribers of Google’s $20-per-month Google One AI Premium plan. As reported by 9to5Google, the memory function is not yet available in Gemini’s iOS and Android apps; for now, it works exclusively on the web.
To help users get started, Gemini provides examples of how memory can be utilized effectively. These include prompts like “Use simple language and avoid jargon,” “I can only write code in JavaScript,” or “When planning trips, include cost per day.” For now, the feature is limited to English-language prompts. Additionally, users have the option to disable the memory function at any time, and stored memories can be manually deleted.
Importantly, Google has assured users that information saved in Gemini’s memory will not be used to train its AI models. A spokesperson clarified that the stored data is not shared or repurposed in any way.
However, memory features in chatbots, including those from Google and OpenAI, raise potential security concerns. Earlier this year, a security researcher discovered vulnerabilities in ChatGPT’s memory system that allowed attackers to plant false memories, effectively enabling the theft of user data. Such incidents underscore the need for robust safeguards to protect user information.
As Gemini continues to evolve, its memory feature opens up exciting possibilities for more customized and relevant chatbot interactions. But as with any new AI capability, users should exercise caution and make sure they are comfortable with the data they choose to share.