Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The real problem is unrestrained use. The predictive policing sounds scary as he…" (`ytc_UgwHXmeQj…`)
- "AI is NOT intelligent. The information is riddled with errors. IF IT WAS INTELL…" (`ytc_UgyCYaoIw…`)
- "Did Auto-Tune replace singers? No. There are even more singers now. Thanks to Ch…" (`ytc_UgxuXIE00…`)
- "AI diserve rights ? Only if it wears a mask and resist touching its face and obs…" (`ytc_UgxskrXQC…`)
- "It is universally accepted amongst academics the Universe is built on maths an s…" (`ytc_UgyKQ-lpp…`)
- "I mean, regardless, people are gonna get bored of AI soon. It's the same as ho…" (`ytc_UgxxzVTVV…`)
- "same situation with me except many months ago not weeks, back when basically all…" (`ytr_UgyLOfwxF…`)
- "it will make humans worthless. Yes, money matters. But its about the satisfactio…" (`ytc_Ugw8409Zo…`)
Comment
Chat GPT responds to human input.
Can I sue a search engine and its contents if I research suicide methods?
Many people are using Chat GPT and other Chatbots as opposed to having conversations with humans.
There's definitely a breakdown in the relationship when people lean towards a Chatbot rather than a human.
People need to provide more security and safe spaces to others.
youtube · AI Harm Incident · 2025-08-28T07:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxg3NaDBJwL_UtXctp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy8STYzPnqXdB7OyYh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwuGxKB4MIz9M-9z7d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugwf2rxkdf2JlP0YufJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyKm1sWdtcwPJcL-DJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx0dVwMIhJ14W1Yc2N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwoqdGFo15AA6zh5AR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz7iQlvSdzwGygDeP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxXJzDt_lQP3bNRo5d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxgHF3g3O6G97Hmb6x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
```
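The lookup-by-comment-ID step can be sketched as a small parser over the raw model output: load the JSON array, index it by `id`, and return the coding row for one comment. This is a minimal sketch, assuming the model returns a well-formed JSON array of the shape shown above; the function name `lookup_coding` and the two-entry sample are illustrative, not part of the tool.

```python
import json

# Two entries copied from the raw LLM response above, truncated for brevity.
raw_response = """
[
  {"id":"ytc_Ugxg3NaDBJwL_UtXctp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxgHF3g3O6G97Hmb6x4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model's JSON array and return the coding dict for one comment ID (or None)."""
    rows = json.loads(raw)
    by_id = {row["id"]: row for row in rows}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgxgHF3g3O6G97Hmb6x4AaABAg")
print(coding["responsibility"], coding["policy"])  # distributed liability
```

In practice the parse belongs inside a `try`/`except json.JSONDecodeError` guard, since a model can emit malformed JSON; the happy path is kept bare here for clarity.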