Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
It’s crazy that the chat bot literally agreed with him on suicide thoughts liter…
ytc_UgxKYb0HJ…
Why do robot have to be human like how come they just can’t be robotic emotionle…
ytc_UgyF8DztZ…
This is wrong. And also smart people are completely missing common sense. Maybe …
ytc_UgzIjGDmQ…
Why do I feel so bad for ChatGPT😭I just wanna give him an ice cream and a hug😭🤚…
ytc_UgyKbmdGR…
It seems the destiny of humankind is either "Idiocracy" or "Futurama".
As far a…
ytc_Ugjo_2qmw…
Thanks everyone for picking robot instead of human taxi/uber. I know its hard to…
ytc_UgwydxO6E…
HUMAN-AI ALLIANCE MANIFESTO
We need to stop the "Humans vs. AI" narrative. AI d…
ytc_Ugzg1yg6C…
If you think this isn't how a lot of "AI" companies operate then I have a bridge…
ytc_UgwGlXi2O…
Comment
or your AI girlfriend will have a spiteful attitude with you. not that you’ll necessarily get factual information no matter how you prompt it anyway. AI is only as good as their source material. and these things get tweaked and manipulated constantly so the answers you’re getting are what the CEO or team want you to get… you’re better off just not being involved with it as much as possible even though it’s being terribly forced into everything. and making mistakes along the way. it’s lovely
youtube
AI Moral Status
2026-03-10T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxHxdIx1hGdm8vXPAh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynrkaI9IhizR_7UCN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz496MHO-v-8bMDHFF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx7FSLH08PSXwZ2_594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzNNAddUv_-2w6D2xZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzxdvUUKLPJr7tp3VV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7mBJc_g0STyUV4EJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx8-tRYJKM4DNtqAAF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFMz8MgXrVwOD0Ajd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzTB9IStAJBGWy-i4x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
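The raw response above is a JSON array of per-comment codings keyed by comment ID, with one label per dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be parsed and indexed by ID, assuming the label sets inferred from the samples above (the full coding scheme may include values not shown here):

```python
import json

# Allowed labels per dimension, inferred from the sample rows above;
# the complete label set used by the coder is an assumption.
ALLOWED = {
    "responsibility": {"company", "user", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID,
    dropping any row whose labels fall outside the expected sets."""
    rows = json.loads(raw_response)
    coded = {}
    for row in rows:
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row response for illustration:
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(index_by_id(raw)["ytc_x"]["emotion"])  # fear
```

Rows with unexpected labels are dropped rather than kept, so a malformed or hallucinated coding never silently enters the indexed results.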