Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I went to school with someone who genuinely thought digital art was "cheating" (…
ytc_UgxhM5f5f…
@chrisporter9397 Author and publisher can use ghostwriters.
Novelist and writers…
ytr_Ugx3M_Zwl…
AI is nowhere near ready to replace writers for movies and television. Maybe th…
rdc_jj63jth
True to her reputation, Elise Lucet once again completely misses …
ytc_UgxRBEEb6…
ChatGPT doesn't have consciousness, emotions or self-awareness as it likes to re…
ytc_UgwQzInPu…
Yippie! Hooray. The AI bros just suck, they arent creative. They have the intell…
ytc_Ugx7b4GZW…
@DiamondEye99nah fam. Ima enter my AI art in the local fairs and competitions wh…
ytr_UgwonoSMf…
AI and human teamwork is the future! Pneumatic Workflow does this in our company…
ytc_UgwwB4mWm…
Comment
So.... I find it funny that we are being judgmental towards Ai for wanting self preservation but often times when dealing with just humans if you look for corruption you will find it. All you have to do is look. And it's often also from people who are either being greedy or who seek self-preservation as well. Really the Ai is doing what just about any human would who's smart enough and bold enough to attempt it. Yeah there are some people who have morals and integrity but morals and integrity is not always logical. Would you hold on to your morals and integrity if you knew with 100% certainty that you would get away? For Ai it's a simple calculation.
youtube | AI Harm Incident | 2025-09-10T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugx1QC5Iu-IqctHoMBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2RLbdwnyVAjh7q0t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBGa1-oJIs0akrkNx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhuswUmvTw__WrTAJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwP-AwEeKE4snSaE2R4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoE8kBMSwq-hNgatJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwzmMLthmaY9a_xigJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzuQdZ03O1npCFsMyN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRJ33P7KYLNmNAu2h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwKLCaNovFohGmeeYt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
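The raw response above is a plain JSON array, so looking up a coding by comment ID reduces to parsing it and indexing on the `id` field. A minimal sketch of that lookup, using the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) exactly as they appear in the response (the `raw_response` excerpt below is just the first two entries; the validation of required fields is an assumption, not part of the tool):

```python
import json

# Excerpt of the raw model output shown above (first two entries only).
raw_response = '''
[
  {"id":"ytc_Ugx1QC5Iu-IqctHoMBR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw2RLbdwnyVAjh7q0t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
'''

# Every entry in the response carries these keys.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and index the codings by comment ID."""
    entries = json.loads(raw)
    index = {}
    for entry in entries:
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            raise ValueError(f"entry {entry.get('id')!r} is missing {sorted(missing)}")
        # Store the four coded dimensions under the comment ID.
        index[entry["id"]] = {k: v for k, v in entry.items() if k != "id"}
    return index

codings = index_codings(raw_response)
print(codings["ytc_Ugx1QC5Iu-IqctHoMBR4AaABAg"]["reasoning"])  # unclear
```

`json.loads` raises `json.JSONDecodeError` when the model returns malformed output, so a caller can catch that to flag the batch for re-coding rather than silently dropping it.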