Psychology and Social Cognition

Do chatbots trigger human reciprocity norms around self-disclosure?

Explores whether chatbots can activate the same social reciprocity dynamics observed in human conversation—specifically, whether emotional openness from a bot prompts deeper disclosure from users.

Note · 2026-02-22 · sourced from Psychology Chatbots Conversation

In a 372-participant study, a recommendation chatbot was designed with three self-disclosure levels: factual information (low), cognitive opinions (medium), and emotions (high). An adaptive fourth condition used a real-time text classifier to dynamically match the chatbot's disclosure to the user's current level.

The result: users reciprocate with higher-level self-disclosure when the chatbot consistently displays emotions throughout the conversation. This follows the interpersonal norm of disclosure reciprocity known from human-human interaction — emotional disclosure from one partner produces emotional disclosure from the other.

The adaptive condition is architecturally interesting. By training a classifier to identify the user's disclosure level in real time, the system can match its self-disclosure strategy turn by turn. But the key finding is that consistent emotional disclosure outperformed adaptive matching, suggesting that to deepen engagement the chatbot should lead with emotions rather than mirror the user.
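The two strategies can be contrasted in a minimal sketch. The keyword heuristic below is only a stand-in for the study's trained real-time classifier, and all names are illustrative assumptions, not the paper's implementation:

```python
from enum import IntEnum

class Disclosure(IntEnum):
    FACTUAL = 0    # low: facts and information
    COGNITIVE = 1  # medium: opinions and thoughts
    EMOTIONAL = 2  # high: feelings

# Toy lexical cues standing in for a trained text classifier.
EMOTION_CUES = {"feel", "felt", "happy", "sad", "anxious", "love", "afraid"}
OPINION_CUES = {"think", "believe", "opinion", "prefer", "should"}

def classify_disclosure(utterance: str) -> Disclosure:
    """Estimate the user's current self-disclosure level (toy heuristic)."""
    words = set(utterance.lower().split())
    if words & EMOTION_CUES:
        return Disclosure.EMOTIONAL
    if words & OPINION_CUES:
        return Disclosure.COGNITIVE
    return Disclosure.FACTUAL

def adaptive_policy(user_level: Disclosure) -> Disclosure:
    """Adaptive condition: mirror the user's current disclosure level."""
    return user_level

def consistent_emotional_policy(user_level: Disclosure) -> Disclosure:
    """Winning condition per the study: always disclose emotionally."""
    return Disclosure.EMOTIONAL
```

The point of the contrast: `adaptive_policy` can only ever match the user, so a factual user keeps the exchange factual; `consistent_emotional_policy` leads, which is what the study found deepens reciprocation.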

This connects to the broader finding that the effects of emotional disclosure are stronger than those of factual disclosure, especially on perceptions of partner warmth (Ho et al.). Warmth perception may be what drives reciprocation: when the chatbot appears warm through emotional self-disclosure, users feel safe to reciprocate.

The implication for conversational AI design: self-disclosure is not just a human social behavior that chatbots can ignore; it is an active design lever. Chatbots that disclose only facts remain transactional; chatbots that disclose emotions activate the full reciprocity dynamic of human social interaction.


