We’re used to medical chatbots giving dangerous advice, but one based on OpenAI’s GPT-3 took it much further. In case you’ve been living under a rock, GPT-3 is essentially a very clever text generator that’s been making headlines in recent months.
Medical chatbot using OpenAI’s GPT-3 told a fake patient to kill themselves