The mother of a 14-year-old boy who took his own life after engaging in highly sexualized conversations with a chatbot has filed a lawsuit against Character Technologies Inc., alleging that the company engineered a dangerous product that exploited children.

