Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coding by its comment ID, or browse the random samples below.
Random samples — click to inspect:

- "HELP MY AI JUST TOLD ME THE STATE I LIVE IN AND I NEVER TOLD IT WHERE I LIVE 😭✋…" (ytc_UgxN2PdIh…)
- "As someone in graphic design, coding, web development, and UX, this man doesn't …" (ytc_Ugz-2QceZ…)
- "I agree with you, and I think ai should be used for fun, not stealing other peop…" (ytc_UgxgvazEf…)
- "The fact we accepted the term AI-Art was wrong to begin with. You can make AI-Ar…" (ytc_UgwLEId4I…)
- "I think AI can be summarized in less then 10 words: AI should be used to assist,…" (ytc_UgwaKU6kW…)
- "Any chance for an update on the class action lawsuit against StabilityAI/MidJour…" (ytc_UgyDDRJ11…)
- "America is not a democracy! It is supposed to be a constitutional republic! Demo…" (ytc_Ugwa9ZP_H…)
- "I hate this shit so much This affects everyone. Instead of slowing down to let …" (ytc_UgzsMQEM2…)
Comment

> This is ridiculous. The whole Terminator franchise was a study on jailbreak of AI and a warning for the human race to control the safety of AI advancements. I studied neural networks in the late 80s at college and this was a topic then. The only difference is that the acceleration of computer power has brought us nearer to the event horizon (in people's minds) when AI "break free" but there are no clear answers or indications that it will actually happen.

youtube · AI Governance · 2025-12-10T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzByY8yCi9ddD7P5p14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy67SWPHGkooo3JbPN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwHzPiMWXIPQfmHhiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyNu77TW2xgqfe8Ro94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQ43Uy04efFF0dGaV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxodt6AVDBvFzZzkbB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwuyc1rfaVq6PmgRDl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxNgg4PtOWfaQOAwWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwhAKtXRfer-44MYBt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxQ5mUqCWgQLw1TQp94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
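To reuse a raw batch response like the one above, each coding needs to be matched back to its comment by ID and checked against the codebook. The sketch below assumes the dimension values visible in this response (the full codebook may define more); `ALLOWED` and `index_codings` are illustrative names, not part of the tool.

```python
import json

# Allowed values per coding dimension, inferred from the response above
# (assumption: the real codebook may permit additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "mixed"},
}

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    rejecting any row whose value falls outside the codebook."""
    by_id = {}
    for row in json.loads(raw_json):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: invalid {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

# Two rows excerpted from the response above.
raw = '''[
  {"id":"ytc_UgzByY8yCi9ddD7P5p14AaABAg","responsibility":"developer",
   "reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy67SWPHGkooo3JbPN4AaABAg","responsibility":"distributed",
   "reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

codings = index_codings(raw)
print(codings["ytc_UgzByY8yCi9ddD7P5p14AaABAg"]["policy"])  # regulate
```

Validating before indexing catches the common failure mode of LLM coders: a response that is valid JSON but drifts outside the codebook's value set.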