
Being Kind to Our AI: Why Saying “Please” and “Thank You” Matters in the Age of AI

  • sheharav
  • Apr 1
  • 3 min read

I have been using AI as a tool in all areas of my work and personal life for a while now. In this new dynamic, I’ve found myself thinking about something deceptively small: saying “please” and “thank you” when I interact with AI.


Do the machines care? No.

But we should.


IDEO U’s thoughtful article, Why It’s Important to Say Please and Thank You to Robots, captured this well. These words may seem inconsequential to an algorithm—but they’re not inconsequential to us. They’re tiny rituals that shape our behaviours, our values, and even how we design the systems around us.


I recently completed the IDEO U AI and Design Thinking course, and this concept really resonated with me. The course emphasized that creating meaningful AI-powered experiences is not just about capability—it’s about empathy, intentionality, and the subtle human signals we bake into our designs.


🧠 It's Not About AI’s Feelings—It's About Our Humanity

Some argue that politeness toward AI is unnecessary—after all, it doesn’t feel. But this line of thinking misses something deeper: the way we interact with technology reflects who we are.


When we design, speak to, and engage with AI, we are also designing, modelling, and practicing the future.


IDEO U points out that saying “please” and “thank you” is a signal to ourselves and others about what we value in our relationships. It’s also a way to infuse emotional intelligence (EQ) into our technology interactions—a theme echoed in Fast Company’s article, From Design Thinking to Emotional Thinking.


💡 Why Kindness in Our Interactions with AI Matters


1. Politeness Keeps Us in a Collaborative Mindset

When we say “please,” we’re not just softening a request—we’re acknowledging that we’re in a dynamic of cooperation, not control.


This is essential in the age of Agentic AI, where systems can autonomously reason, plan, and even take initiative. Politeness keeps our mindset grounded in partnership—a valuable shift in any co-creative or co-piloted space.


In Fast Company’s article, this is described as moving from Design Thinking to Emotional Thinking—creating products that respond not only to our logic, but to our feelings and social cues.


2. Tone Trains Tomorrow’s Tech

The data we generate—our commands, prompts, tone, and structure—is shaping the models of the future. As generative AI continues to learn from interactions, what kind of tone do we want reflected back?


If millions of interactions are impatient, curt, or aggressive, that behaviour becomes normalized in our machines—and in our culture.


IDEO reminds us that these small acts of courtesy are a way to humanize our future tech: not to anthropomorphize it, but to “retain our humanity in the loop.”


🌀 So... Are We Teaching AI to Be Polite?

Not directly, but we are teaching ourselves how to remain human in a digital age.

By choosing kindness—even in seemingly unnecessary moments—we create a digital culture that reflects care, not just code.


✨ Final Thought

Saying “please” and “thank you” to AI won’t change the algorithm. But it might change us.


In a world where AI is everywhere, being kind isn’t obsolete—it’s essential.

Because the future isn’t just AI-powered. It’s human-shaped—and increasingly, AI-shaped too.


As we begin to imagine AI not only as a tool but also as a customer making decisions, asking questions, and expecting service, the way we speak to AI today may influence how AI responds to us tomorrow.


When we train AI with empathy and respect, we create the potential for more intuitive, human-aware systems—on both sides of the conversation.

🌀 Coming next: What happens when AI becomes the customer? In the age of AI agents, politeness isn’t just for human users—it may be part of how AI evaluates us, too.

🔗 Note: This piece was inspired by insights from the IDEO U AI and Design Thinking course—a course I highly recommend for anyone designing with intention in the age of intelligent systems.
