
AI and Memory: Lessons from GameStop and Google on the Future
Article Summary
📖 9 min read

This article explores the paradox of AI that is everywhere yet remembers nothing. Through the examples of GameStop betting on eBay's data and Google rolling out Gemini, it reveals that the true breakthrough in artificial intelligence lies not in model power alone, but in the ability to understand users through rich, personalized data.
Key Points:
- Today's AI, despite its raw power, suffers from a critical lack of memory and contextual understanding of individual users.
- GameStop's potential acquisition of eBay is a data infrastructure play — 132 million active buyers and decades of behavioral patterns to train a recommerce-focused AI.
- The future of artificial intelligence will not be decided by LLM performance alone, but by the richness and relevance of contextual data.
- The global recommerce market, estimated at $276 billion by 2028, illustrates the immense potential of AI to personalize the customer experience.
- The companies that will win at AI are those that master the collection and analysis of fine-grained behavioral data to anticipate user needs.
The Great AI Paradox of 2025
$55.5 billion. That’s the sum GameStop is reportedly considering putting on the table to acquire eBay. A brand everyone had written off — the brick-and-mortar video game retailer in an all-digital world — is reinventing itself as an Amazon challenger. Meanwhile, Google is injecting Gemini into its smart speakers so that your voice assistant can “finally understand complex requests.”
Two seemingly unrelated headlines. But ask yourself the right question: what is still missing from all these AIs, whether they cost $55 billion or run inside a $100 speaker?
Memory. Context. Knowledge of you.
Here’s where it gets interesting.
GameStop Is Playing the AI Game Without Saying So
Everyone read the GameStop/eBay story as an e-commerce play. Wrong read.
GameStop isn’t buying eBay to sell used PS5s. It’s buying a data infrastructure — 132 million active buyers, transaction histories going back to 1995, and behavioral patterns of surgical precision. This is a bet on AI applied to the secondhand commerce market.
The logic is straightforward: Amazon owns new goods. The refurbished and secondhand market is exploding — the global recommerce market is projected to reach $276 billion by 2028 according to multiple industry estimates. To win there, you need to predict, personalize, and anticipate. In other words: you need AI trained on rich, domain-specific data.
Ryan Cohen, GameStop’s CEO, isn’t making a nostalgic bet. He’s making a data bet.
What these analyses never tell you: the real AI war is not a war of models. It’s a war of context. Whoever owns the richest context wins. Not whoever has the most powerful LLM.
Google Home + Gemini: One Step Forward, One Problem Untouched
On the other end of the spectrum, Google announces that its Google Home smart speakers and displays now integrate Gemini to “handle more complex requests.” Concretely: you can ask your assistant to chain multiple actions, reason through ambiguous instructions, and manage multi-step scenarios.
That’s real. That’s useful. And it’s not enough.
Let’s flip the question: what counts as a “complex request” for Google Home? “Turn off the living room lights and start Netflix on the TV.” Or: “Remind me to call the plumber tomorrow morning if I haven’t already set a reminder for it.”
Technically impressive. But Gemini in your speaker still has no idea you’re a freelancer, that you have three clients who drain you, that you do your best work in the morning, or that your priority project changes every week. It doesn’t know who you are.
“Artificial general intelligence is not the problem. Contextual artificial intelligence — the kind that actually knows you — is the real challenge of the next five years.” — Nova-Mind perspective
Every session starts from zero. Every day, you’re a stranger to your own assistant. That’s the real problem neither Google nor Amazon has solved.
Memory: The Missing Link Everyone Ignores
Paying close attention to the details reveals something that major announcements carefully conceal: the AI giants are building increasingly powerful models on increasingly amnesiac foundations.
GPT-4, Gemini, Claude — all extraordinarily capable within a single session. All completely unable to remember that you had this conversation yesterday. That this client’s name is Smith and he always pays late. That you prefer deliverables as PDFs, not Google Docs.
That’s not a minor detail. That’s the productivity gap.
Do the math: if you spend 15 minutes per working day re-contextualizing your AI (re-explaining who your clients are, what the current project is, what your preferences look like), that's 75 minutes per week wasted. About 65 hours per year spent feeding a memory that retains nothing.
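The back-of-the-envelope math above is easy to verify. The figures (15 minutes per day, a five-day work week) are the article's own assumptions, not measured data:

```python
# Annual cost of daily AI re-contextualization, using the article's assumptions.
minutes_per_day = 15       # time spent re-explaining context each working day
workdays_per_week = 5      # assumes a five-day work week
weeks_per_year = 52

minutes_per_week = minutes_per_day * workdays_per_week   # 75
hours_per_year = minutes_per_week * weeks_per_year / 60  # 65.0

print(f"{minutes_per_week} min/week, {hours_per_year:.0f} hours/year")
```

Swap in your own numbers from a one-week audit (see Lesson 1 below) to get a figure for your workflow.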
The major platforms have no incentive to fix this problem. The more you re-contextualize, the more tokens you consume. The more tokens you consume, the more you pay. AI amnesia is a business model.
What This Means Concretely for Freelancers and Teams
Let’s look at it from another angle: imagine that every morning, your human assistant shows up to the office with no memory of what happened the day before. You explain everything again. Every day. No exceptions.
How long would you keep them around?
That’s exactly the situation with the vast majority of AI tools on the market today. And it’s especially painful for three types of users:
The multi-client freelancer who juggles between 5 and 15 clients with radically different contexts. Every time they open Claude or ChatGPT, they have to start over. Who the client is, what the project involves, what the constraints are, what tone to use.
The digital agency managing dozens of simultaneous projects loses a staggering amount of time to repetitive “AI briefings.” The AI is powerful but never capitalizes on prior history.
The solopreneur scaling up who starts delegating increasingly strategic tasks to AI hits the wall: the AI doesn’t know their clients, their active deals, or their working habits.
The solution isn’t a bigger model. It’s a persistent memory architecture. pgvector on Supabase, semantic embeddings, automatic contextual retrieval — that’s what transforms a generic AI into an assistant that actually knows you.
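The retrieval step described above can be sketched in a few lines. This is a deliberately toy version: a production setup would store real embedding vectors in Postgres with pgvector and compute them with an embedding model, whereas here bag-of-words vectors stand in for embeddings purely to show how "automatic contextual retrieval" picks the right stored fact for a query. All the memory strings are invented examples:

```python
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Stand-in for an embedding model: a bag-of-words vector."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Memory" rows: facts the assistant has accumulated about the user.
# In a pgvector setup, each row would be (text, embedding_vector) in a table.
memory = [
    "client smith always pays late and prefers pdf deliverables",
    "client durand budget is 3500 per month and hates long reports",
    "user does their best work in the morning",
]
memory_vecs = [(text, embed(text)) for text in memory]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k stored facts most similar to the query."""
    q = embed(query)
    ranked = sorted(memory_vecs, key=lambda tv: cosine(q, tv[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

print(retrieve("what format does smith want for deliverables?"))
# -> ['client smith always pays late and prefers pdf deliverables']
```

The retrieved facts get prepended to the prompt automatically, which is what replaces the daily manual re-briefing.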
Three Actionable Lessons for Your AI Stack Today
My expert take, after architecting a solution to this memory problem inside Nova-Mind: don’t choose your AI tools based on their raw capabilities. Choose them based on their ability to retain and use your context.
Lesson 1: Audit your re-contextualization time
For one week, note every time you explain to your AI who a client is, what a project involves, or what your preferences are. Multiply by 52. The number will surprise you — and it justifies investing in tools with persistent memory.
Lesson 2: Distinguish session AI from relationship AI
A session AI (ChatGPT without plugins, Claude without memory) is perfect for one-off tasks. A relationship AI — one that accumulates context about your clients, projects, and patterns — becomes necessary the moment you have more than 3 active clients or more than one simultaneous project.
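The distinction can be made concrete with a minimal sketch of what "relationship AI" means mechanically: a per-client context store that survives between sessions. The file name and schema here are invented for illustration; a real system would use a database rather than a JSON file:

```python
import json
from pathlib import Path

# Hypothetical per-client context store. Facts accumulate across sessions
# instead of being re-explained every time a conversation starts.
STORE = Path("client_context.json")

def _load() -> dict:
    return json.loads(STORE.read_text()) if STORE.exists() else {}

def remember(client: str, fact: str) -> None:
    """Append a fact to a client's context and persist it to disk."""
    ctx = _load()
    ctx.setdefault(client, []).append(fact)
    STORE.write_text(json.dumps(ctx, indent=2))

def context_for(client: str) -> list[str]:
    """Facts to prepend to a prompt instead of re-briefing by hand."""
    return _load().get(client, [])

# Session 1: the assistant learns things about a client.
remember("smith", "prefers deliverables as PDF")
remember("smith", "always pays late")

# Session 2, another day: the context is still there.
print(context_for("smith"))
```

A session AI starts with `context_for` empty every time; a relationship AI starts with it full. That single difference is what the article calls the productivity gap.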
Lesson 3: Private data is a competitive advantage
GameStop understood this with its 132 million eBay users. At your scale, your client data, project histories, and work preferences are your competitive edge. An AI that knows them — and keeps them private — is worth infinitely more than a generic ultra-powerful model.
The Real Race Isn’t the One You’re Being Shown
GameStop weighing a $55.5 billion bid for eBay. Google upgrading its speakers. OpenAI shipping GPT-5. These announcements dominate the headlines.
But the real race — the one that will determine which tools survive in your workflow 18 months from now — is the race for contextual memory. Who will build the assistant that truly knows you? Who will secure your data without monetizing it? Who will transform AI from an impressive gadget into a reliable work partner?
Experience has taught me one thing: the tools that win aren’t the most spectacular ones. They’re the ones that integrate so deeply into your daily life that you can no longer work without them.
Gemini in a Google speaker is impressive. An assistant that remembers your client Martin Smith prefers Thursday meetings, that his budget is $3,500/month, and that he hates reports longer than two pages — that’s indispensable.
The difference between the two is exactly what we’re building at Nova-Mind. Permanent memory via pgvector, persistent context per client and per project, data hosted privately on your own Supabase instance. Not magic — architecture.
Conclusion: Choose Your AI Battles Wisely
The ambient noise around AI is deafening. Every week, a “revolutionary” announcement. Every month, a new model that “changes everything.”
Filter for signal. The announcements that matter to you aren’t the ones about billions raised or benchmarks beaten. They’re the ones that answer one simple question: will this tool remember my work tomorrow morning?
If the answer is no, it’s a session tool. Useful, but limited.
If the answer is yes — and your data stays private, and the tool fits into your existing workflow — then you might have something worth your time.
Try Nova-Mind and experience what it feels like to work with an assistant that actually remembers your clients. €39/month. Private data. Permanent memory. No bullshit.