Nigeria, April 14 -- The feature, still in development, would apply only to adult users and would require them to voluntarily select a family member, friend, or other trusted contact. According to details shared by the company, the aim is to create a pathway for human support when conversations suggest a user may be in distress.
How the system is expected to work
The proposed system would rely on automated detection of emotional signals in chat conversations. These signals may include repeated expressions of distress, mentions of harmful intent, or patterns of language that suggest emotional instability.
OpenAI has not published the exact thresholds that would trigger an alert, and it remains unclear how the system would separate genuine distress from casual or figurative language.
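To make the idea concrete, the sketch below shows one simple way such signal detection could work in principle. It is purely illustrative: the keyword patterns, the threshold, and the function and class names are assumptions for the sake of the example, not details OpenAI has disclosed, and a production system would likely rely on trained models rather than keyword lists.

```python
# Illustrative sketch only: OpenAI has not published how its detection works.
# This toy scanner flags a conversation once repeated distress signals appear,
# mirroring the kinds of patterns described above.

import re
from dataclasses import dataclass

# Hypothetical signal patterns; a real system would use trained classifiers.
DISTRESS_PATTERNS = [
    r"\bi can'?t (go on|take this)\b",
    r"\bhurt (myself|someone)\b",
    r"\bno (point|reason) in living\b",
]

@dataclass
class Alert:
    should_notify_contact: bool
    matched_signals: list

def scan_conversation(messages, threshold=2):
    """Count distress-pattern matches across a user's messages and
    flag the conversation once the (assumed) threshold is reached."""
    matches = []
    for text in messages:
        for pattern in DISTRESS_PATTERNS:
            if re.search(pattern, text.lower()):
                matches.append(pattern)
    return Alert(should_notify_contact=len(matches) >= threshold,
                 matched_signals=matches)

# Example: two separate distress signals cross the illustrative threshold.
alert = scan_conversation([
    "I can't take this anymore.",
    "There is no point in living like this.",
])
print(alert.should_notify_contact)  # True
```

Even in this simplified form, the example shows why threshold choice matters: set too low, the system would flag everyday venting; set too high, it could miss the situations the feature is meant to catch.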