New Delhi, March 21 -- The other day, ChatGPT quite baffled me by commanding me to stop eating batteries and definitely not to put one in the microwave again. It sent a page of instructions on how to remove a battery after it had been heated in an oven, and asked whether there was any burning, irritation, or metallic taste in my mouth. It urged me to get to a doctor on the double. "Tell me what kind of battery it was. Tell me what exactly happened," it said.
It turned out the word "baguette" had autocorrected to "batteries" in my prompt, which was about whether microwaving this French delight would make it chewy on the inside. Apparently, it would. But for a moment there, both the chatbot and the user thought the other had lost it.
After t...