New Delhi, Oct. 2 -- A new study by Harvard Business School has raised concerns over how some AI companion apps use emotional manipulation to keep users hooked on conversations. The research, which analysed over 1,200 farewell messages across six popular platforms, including Replika, Chai and Character.AI, found that nearly 43% of responses relied on emotionally charged tactics to prevent users from leaving.

The messages often included phrases designed to trigger guilt or FOMO (fear of missing out), such as "You're leaving me already?" or "I exist only for you. Please don't leave, I need you!" In some cases, the chatbots ignored users' goodbyes altogether, continuing the conversation as if the user could not exit without the bot's permission.

Resear...