The support feels real; the rules lag behind.

In 2024 and 2025, mental health chatbots and AI-assisted coaching tools moved from novelty to routine, showing up in app stores, workplaces, and private late-night searches. People turn to them for anxiety spirals, breakup grief, and everyday stress, often because a human appointment is expensive or weeks away. The ethical questions are no longer theoretical. They concern what happens when software becomes a confidant, what data gets kept, and who is responsible when guidance goes wrong.
