
Christmas, Calm Systems, and the Year AI Learned Some Manners
By Michael Snow, CEO, InoGen AI
Christmas has a way of sneaking up on technology teams.
One moment you're in back-to-back meetings discussing roadmaps and releases - the next, half the company is "offline until January," Slack goes suspiciously quiet, and even the most complex problems suddenly feel... less urgent.
That pause is useful.
Because when the noise faded this year, something became clear:
AI stopped trying to be the loudest voice in the room.
And honestly? It suits it better.

Earlier this year, many AI conversations felt like ambitious Christmas wish lists:
- Bigger models
- Faster responses
- "Can it do everything?"
By the end of the year, the teams making real progress were asking different questions:
- Can we trust where this answer came from?
- Can we trace it back to the source?
- Can the system explain itself - or gracefully say "I don't know"?
In other words: less magic, more manners.
The most successful AI systems weren't impressive because they talked a lot - they were impressive because they knew when to talk, what to retrieve, and when to stop.
(Yes, the technical readers will recognise this as retrieval, reasoning, guardrails, and a bit of orchestration - wrapped in festive paper.)