Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
mhm remember the cylons in Battlestar Galaktica
- it starts with slavery and th…
ytc_UgwhOkhQk…
We humans throw EVERY SINGLE YEAR MLLIONS OF TONS OF PLASTIC in the oceans, i th…
ytc_Ugxnk8-UZ…
I am not against AI, I think the biggest error comes from companies trying to f…
ytc_Ugwih-dMM…
There should be a code of ethics for AI. But unfortunately Asimov lived (or wrot…
ytr_UgysjfRUS…
I want AI to do things like dishes and laundry so I can do art and write. I'm so…
ytc_UgzD1TFzb…
I saw somebody somewhere once say, “I don’t want AI to draw for me so I have mor…
ytc_UgxEBqLg3…
I don’t think sentient AI would want to kill people. AI gains functionality thr…
ytc_Ugxxz8k-K…
I think the question of liability is one of the biggest problems with autonomous…
ytc_Ugz-AoU5m…
Comment
The problem causing the most suffering in the world is people working jobs they hate to pay for mortgages and taxes they which create a subconscious desire for the end of the world. We manifested AI and the reality upon us while watching doomsday movies about robots taking over the world 40-50 years. By the way we were sitting down to watch those movies to escape reality 😂
youtube
AI Governance
2025-11-27T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxw4WQsLH94GNiE0RR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyd9ZBgs2x8JOflNDZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2hb0k6DKJ67R_q-V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwlQnwgtX9qvd2AWtN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyupmktZHf3aznpKMZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzNDcdJ4XY4f1dfbsh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyVYnylAwEXyo5TJ4d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgymnZ3RSaBvqY51Utt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw_GLRzsJpI8cv7jiB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzRBBnWGNnnu6tMy54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
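The raw LLM response above is a JSON array of per-comment records keyed by comment ID, with one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed into an id-based lookup — the `SCHEMA` vocabulary below is inferred from the sample values shown above and is an assumption, as is the function name `parse_coding_response`:

```python
import json

# Allowed values per coding dimension, inferred from the sample records above.
# This vocabulary is an assumption; extend it if the real codebook differs.
SCHEMA = {
    "responsibility": {"none", "developer", "company", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    into an id -> record lookup, validating each dimension value."""
    lookup = {}
    for rec in json.loads(raw):
        rid = rec["id"]
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rid}: bad value {rec.get(dim)!r} for {dim!r}")
        # Store the record without its id, which becomes the lookup key.
        lookup[rid] = {k: v for k, v in rec.items() if k != "id"}
    return lookup
```

With the response above, `parse_coding_response(raw)["ytc_UgyupmktZHf3aznpKMZ4AaABAg"]` would return the distributed/mixed/none/mixed record shown in the Coding Result table.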