Playing devil's advocate, many people die using all kinds of tools. It doesn't make the tools any less useful for people who use them responsibly.
That said, the idea that a pattern recognition and generation tool can be used for helping people with emotional problems is deeply unsettling and dangerous. This technology needs to be strictly regulated yesterday.
If you take 300 million people, 2 years, and the US suicide rate of 15/100k/yr, you'd expect about 90,000 people who'd used LLMs to kill themselves anyway, so it's hard to conclude much from the dozen or so people listed in the Wikipedia article. 90k deaths seems a lot to me - maybe there are possibilities for good LLM assistance/therapy to improve things?
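For anyone who wants to sanity-check the arithmetic, here's a rough sketch using the figures above (the ~300 million users, 2-year window, and ~15/100k/yr rate are all assumptions from the comment, not measured data):

    # Back-of-envelope: expected suicides among LLM users over the period
    population = 300_000_000      # assumed number of people who have used LLMs
    years = 2                     # assumed exposure window
    rate_per_100k_per_year = 15   # approximate US suicide rate

    expected = population * years * rate_per_100k_per_year / 100_000
    print(f"Expected deaths over {years} years: {expected:,.0f}")  # ~90,000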