MANIFOLD
OpenAI memory-using chatbot causes fatal domestic dispute by EOY2025?
resolved Jan 6
Resolved
NO

This market predicts whether a fatal domestic dispute, primarily caused by the use of OpenAI's memory-using chatbot, will be reported by December 31, 2025.
Qualifying disputes include spousal or relationship conflicts, conflicts between cohabiting siblings, and conflicts between a legal guardian or estranged parent and a child.

"primarily involving OpenAI's memory-using chatbot"
The greatest single cause of the dispute involving frequency of use, permissions, or the deletion of the offender's stored memory by the victim would all cause this market to resolve as YES.

OpenAI has enhanced ChatGPT with persistent memory capabilities, allowing it to remember user preferences and context across sessions. While this feature aims to improve user experience, it raises concerns about extended agency and parasociality.

The Sewell Setzer III incident would NOT cause this market to resolve YES, as this would not be considered a dispute.

Resolution will be based on credible news reports from reputable sources confirming such an incident.

https://manifold.markets/JoeandSeth/openai-memoryusing-chatbot-causes-f-OELctIZncN

  • Update 2026-01-06 (PST) (AI summary of creator comment): The creator is resolving this market NO because the incident in question (referenced in the lawsuit) did not involve a dispute arising from a contested relationship among the son, the chatbot, and the mother. The mother was an innocent third party who never used ChatGPT and had no knowledge of the product's interactions with her son.


🏅 Top traders

# | Total profit
1 | Ṁ157
2 | Ṁ103
3 | Ṁ86
4 | Ṁ73
5 | Ṁ58

@traders I am planning to leave this unresolved until the story discussed in this comment finishes developing.

I don't know of any other story that would qualify.

https://manifold.markets/JoeandSeth/openai-memoryusing-chatbot-causes-f-OELctIZncN#1h9ibrwgmd2

@JoeandSeth from the lawsuit text:

"Suzanne was an innocent third party who never used ChatGPT and had no knowledge that the product was telling her son she was a threat," the lawsuit says. "She had no ability to protect herself from a danger she could not see."

The dispute was not related to a contested relationship among the son, the chatbot, and the mother. Resolving NO for this reason.

© Manifold Markets, Inc.