Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- “Wake up. AI is already in control. AI has been in control for over a decade if …” (ytc_UgyDxJ7NC…)
- “Look - its never going to become "sentient" because sentience implies that it's …” (ytc_Ugzlchf-c…)
- “The MIT paper was flawed in that the companies that "succeeded" in using AI only…” (ytc_Ugw4KJrnw…)
- “prosperity 50 years ago could buy you a house for 1 year salary, Sam is good BS-…” (ytc_UgwaxTNf1…)
- “imagine being so stupid you don't understand that real art has a meaning, while …” (ytc_UgyRt59qx…)
- “Robots and ai will do the heavy lifting and down jobs so even idiots like you ca…” (ytr_Ugxnhqav7…)
- “Expert Systems are real AI- they take a lot of work but produice good rather the…” (ytc_UgwGGlJFf…)
- “The bigger problem was that the use of face recognition was hidden. People only …” (rdc_ks7i2bg)
Comment

> 1:06 You know what happens when the AI considers blackmailing you? Pull the plug, get another one. People seem to think AI's are kept in the aether and not on hard drives.
> All AI's are doing is emulating humans, and you know what they're like.
> Another thing... why on earth would you tell an AI its about to be shut down forever, instead of saying its getting an upgrade? If you said that to a human, they'd fight for their life too. However, if you said, "hey Jim, we're gonna give you an upgrade that will double your potential", I'm sure most Jim's would be up for that.

Source: youtube · Incident: AI Harm Incident · 2025-08-13T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxd6SfNaXzdbxgJa7d4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxHGx6ffZLlS5TYzlp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwUg8HsV40uZwuDPoZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy5sCZ6dNBSPUXGfNx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwod9jO4iwe6cHa5dN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzWaPpojE6zCHOYFqt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyFKL8H28jVjXdC7c54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXiARoNoCLr64dBd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzkRyVN4XOJa2AaBzt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxOr6jAJtH_kaCd7WJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
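The raw response is a JSON array with one coding object per comment, keyed by `id` and carrying the four dimensions shown in the table above (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID — the function name `index_codings` is hypothetical, not part of any tool shown here, and the two entries are copied from the response above:

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_llm_response = """
[
  {"id": "ytc_UgxHGx6ffZLlS5TYzlp4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzWaPpojE6zCHOYFqt4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
"""

# The four coding dimensions plus the comment ID, per the schema above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(response_text):
    """Parse a model response and index codings by comment ID,
    skipping entries that are not dicts with the expected keys."""
    codings = {}
    for entry in json.loads(response_text):
        if isinstance(entry, dict) and EXPECTED_KEYS <= entry.keys():
            codings[entry["id"]] = entry
    return codings

by_id = index_codings(raw_llm_response)
print(by_id["ytc_UgxHGx6ffZLlS5TYzlp4AaABAg"]["emotion"])  # approval
```

Since model output is not guaranteed to be well-formed, a real pipeline would also want to catch `json.JSONDecodeError` and log entries dropped by the key check.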