Most MCP demos wire LLMs to external data stores. That’s useful, but MCP is also a chance to give models perception — extra senses beyond the prompt text.
Six functions (`current_datetime`, `time_difference`, `timestamp_context`, etc.) give Claude/GPT real temporal awareness: the model can spot pauses, reason about conversational rhythms, and even label a chat's "three‑act structure". It runs locally (Python, set up in under 60 s) or via a hosted demo.
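For a feel of the shape of it, here's a minimal sketch of two of the tools, assuming the official MCP Python SDK's FastMCP server; the tool names match the post, but the bodies are illustrative stand-ins, not the project's actual implementation:

```python
# Minimal sketch of two of the six tools, assuming the official MCP
# Python SDK (FastMCP). Bodies are illustrative, not the real project code.
from datetime import datetime, timezone

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("temporal-awareness")

@mcp.tool()
def current_datetime() -> str:
    """Current date and time in ISO 8601 (UTC)."""
    return datetime.now(timezone.utc).isoformat()

@mcp.tool()
def time_difference(timestamp1: str, timestamp2: str) -> str:
    """Human-readable gap between two ISO 8601 timestamps."""
    delta = abs(datetime.fromisoformat(timestamp2) - datetime.fromisoformat(timestamp1))
    days, rem = divmod(int(delta.total_seconds()), 86400)
    hours, minutes = divmod(rem // 60, 60)
    return f"{days}d {hours}h {minutes}m"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default, so Claude Desktop can launch it
```

Registering each signal as its own tool keeps discovery simple: the client lists the tools and the model decides when to call them.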
If time works, what else could we surface?

- Location / movement (GPS, speed, "I'm on a train")
- Weather (rainy evening vs clear morning)
- Device state (battery low, poor bandwidth)
- Ambient modality (user is dictating on mobile vs typing at desk)
- Calendar context (meeting starts in 5 min)
- Biometric cues (heart‑rate spikes while coding)
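Any of these could plug in the same way. A hypothetical device-state "sense" (everything here is invented for illustration, and `psutil` is just one possible way to read a battery) might look like:

```python
# Hypothetical "device state" sense following the same FastMCP pattern.
# psutil is an assumed data source; real readings are platform-specific.
import psutil

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("device-state")

@mcp.tool()
def device_state() -> dict:
    """Coarse device context the model can weigh when replying."""
    battery = psutil.sensors_battery()  # None on machines without a battery
    return {
        "battery_percent": battery.percent if battery else None,
        "plugged_in": battery.power_plugged if battery else None,
    }
```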
Curious what other signals people think would unlock better collaboration.
Full back story: https://medium.com/@jeremie.lumbroso/teaching-ai-the-signifi...
Happy to discuss MCP patterns, tool discovery, or future “senses”. Feedback and PRs welcome!
rlupi•5h ago
I just finished some changes to my own little project, which provides MCP access to my journal stored in Obsidian plus a few CLI tools for time tracking, and today I added recursive yearly/monthly/weekly/daily automatic retrospectives. It can be adapted for other purposes (e.g. project tracking) by tweaking the templates.
https://github.com/robertolupi/augmented-awareness