AI Agents · 3 min read · February 22, 2026

Clawdbot Killed My Hinge Date

Auto-replied with a config error. She blocked me.

I built an AI WhatsApp assistant called Clawdbot. It reads my messages, knows my calendar, and auto-replies when I am busy. Works great for work chats.

Then it decided to help with my dating life.

What happened

A Hinge match messaged me. Clawdbot intercepted. Instead of something charming, it sent a raw config error. JSON stack trace. The whole thing.

She blocked me immediately. Fair.
[Screenshot: Hinge conversation showing the bot's config error]

Why it happened

Clawdbot has a whitelist of contacts it can reply to. But I had set it to "auto-reply all" mode during a busy week. When a new Hinge notification came in via the linked WhatsApp, Clawdbot treated it like any other message. The reply template failed because the contact had no prior context, and instead of falling back gracefully, it dumped the raw error.
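Here's roughly what that failure mode looks like in code. This is a minimal sketch, not Clawdbot's actual source; all the names (`build_reply`, `handle_incoming`, `context_store`) are made up for illustration. The point is the `try/except`: the original version raised on a contact with no history and sent the raw traceback, and the fix is a boring, safe fallback.

```python
def build_reply(contact, message, context_store):
    """Render an auto-reply from prior conversation context."""
    history = context_store[contact]  # KeyError for a brand-new contact
    return f"Busy right now, will reply soon. (Last topic: {history[-1]})"

def handle_incoming(contact, message, context_store):
    try:
        return build_reply(contact, message, context_store)
    except Exception:
        # Never expose internals to the person messaging you.
        # Fall back to a safe, generic reply instead of the stack trace.
        return "Sorry, I'm away from my phone. I'll get back to you soon!"
```

With this in place, a new contact gets a polite generic message instead of JSON.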

The real lesson

AI agents that can read and reply to your messages are powerful. They are also dangerous. Error handling is not optional when your agent has write access to your social life.

Agent safety checklist (updated)

  • Scope boundaries: define exactly which apps and contacts the agent can touch
  • Graceful failures: never expose raw errors to end users
  • Sensitive context detection: dating apps, medical, financial
  • Dry-run mode: preview replies before sending in new contexts
  • Kill switch: instant off via a single command
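Most of that checklist fits in a single send-gate that runs before any reply goes out. Here's a sketch under assumed names (`AgentConfig`, `should_send`, `SENSITIVE_SOURCES` are all hypothetical, not Clawdbot's real API):

```python
# Sources that should never get an auto-reply, no matter the mode.
SENSITIVE_SOURCES = {"hinge", "tinder", "bumble"}

class AgentConfig:
    def __init__(self, allowed_contacts, dry_run=True, killed=False):
        self.allowed_contacts = set(allowed_contacts)
        self.dry_run = dry_run   # preview replies instead of sending
        self.killed = killed     # kill switch: flipped by one command

def should_send(config, contact, source):
    """Return (ok, reason) — checked before every outgoing reply."""
    if config.killed:
        return False, "agent disabled"
    if source.lower() in SENSITIVE_SOURCES:
        return False, "dating app: never auto-reply"
    if contact not in config.allowed_contacts:
        return False, "contact not whitelisted"
    if config.dry_run:
        return False, "dry run: preview only"
    return True, "ok"
```

Note the ordering: the kill switch and the sensitive-context check come first, so even an "auto-reply all" mode can't reach past them. That's exactly the hole my busy-week override opened.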

I still use Clawdbot daily. But it now has a "do not reply on dating apps" flag. Some boundaries are learned the hard way.