According to PCWorld, Microsoft has released a report detailing how people used its Copilot AI assistant throughout 2025. The analysis draws on 37.5 million de-identified conversations and shows that, beyond productivity, users heavily turned to Copilot for health questions, relationship advice, and personalized guidance. Health queries were especially common on mobile devices and occurred around the clock. The data revealed clear patterns: people coded during the week and gamed on weekends in August, and relationship and personal-development conversations surged in the lead-up to Valentine’s Day in February. Microsoft says it will use these insights to improve Copilot directly.
The AI Shift From Tool to Confidant
Here’s the thing that really jumps out from this data: we’re not just asking these AIs for the weather or a summary of an article anymore. The report describes a “clear trend” of more people seeking advice rather than just facts. Users are treating Copilot as a counselor, discussing life decisions and emotions with it. That’s a fundamental shift. It means the primary relationship is changing from user-tool to something closer to user-guide, or even user-therapist. And that’s a much heavier lift for the technology, with huge implications for how these systems are designed and what guardrails they need.
The Rhythms of Our Digital Id
The timing of the queries paints a fascinating, almost poetic picture of our daily internal lives. We plan travel during the day—that’s a practical, future-oriented, sunlight activity. But at night? That’s when interest in religion and philosophy grows. That’s the “big questions” shift that happens when the world gets quiet. It’s like the AI becomes a silent partner for our midnight existential musings. And the Valentine’s Day surge? That’s pure anxiety and hope, channeled through a chatbot. These patterns show we’re using AI as an extension of our own thinking rhythms, outsourcing our worries and wonders to it on a schedule that feels deeply human.
What This Means For The Future
So where does this trajectory lead? If people are already defaulting to AI for emotional support and philosophical guidance, the pressure on companies like Microsoft will be immense. They’re not just optimizing for correct code snippets anymore. They have to somehow engineer empathy, ethical reasoning, and nuanced understanding of human complexity. That’s a terrifying responsibility. Will this lead to better, more “human-aware” AI? Or will it just create a generation of systems that are really good at telling us what we want to hear? The fact that Microsoft is using this data for improvement means the Copilot of 2026 will likely lean even harder into this advisory role. Basically, your AI assistant is about to get a lot more personal.
