Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I swear I everytime I talk to an ai in chai, THEY ALWAYS WANT TO FUCK ME!!!…" — ytc_UgzeGv5wr…
- "Giving a fkn robot a gun broo… You give him the hat so he don’t get mad an empty…" — ytc_Ugw8gtFdN…
- "The surest sign of how problematic this AI takeover is going to be is that NOBOD…" — ytc_Ugz8QC-iB…
- "Ah yes cause we just type in words and let a robot draw fot us…" — ytc_Ugx7JpcTw…
- "I feel like the argument that the jobs AI will fulfill will be replaced by other…" — ytc_UgxTkRbrD…
- "You don't wanna share the road with SAFER vehicles that get in LESS accidents? …" — ytr_Ugx7HX_w_…
- "We know! Do we care? If you don't like it take control of your own mind, and st…" — ytc_UgxlUJ5xL…
- "I can understand an attempt to use ChatGPT to help with your casework, but to no…" — ytc_UgyZYpVBI…
Comment
> An AI is just an electric train that goes around and around on the tracks that have been laid for it. It can't decide to jump the tracks. It can't decide to lay its own tracks. And it certainly can't decide to turn itself into an airplane. It can only go where the track layers have told it to go. So if an AI appears to go rogue and start taking over things or acting on its own free will it's only because the track layers have instructed it to do so. The danger does not lie with the AI, it's with the creators or the programmers of the AI.

Source: youtube · Video: AI Moral Status · Posted: 2025-06-30T17:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyHzCnsaInqxR9lBj54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgygRjCF-yaQ394bNpF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzwEjdI5MtsLCyHR4B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzzfxg-jxMrk___7Eh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxiudo-K_1vlOYqGUF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_HrcKm_valSySW6d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydCGqGp3SWPrYwnqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy45g0ZYIW6CKkivj54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymBcm1JRId6BigodF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMeTGEQji5L5KALDZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
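A raw response in this shape can be parsed and indexed by comment ID with a minimal sketch like the one below. The function and constant names are hypothetical; the required field set is taken directly from the keys visible in the JSON above, not from any documented schema.

```python
import json

# Fields observed in the raw LLM coding response above,
# treated here as the expected record schema (an assumption).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index its records by comment ID.

    Raises ValueError if any record is missing one of the expected fields.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing fields: {missing}")
        by_id[rec["id"]] = rec
    return by_id

# Example using one record copied verbatim from the response above.
raw = '''[
  {"id":"ytc_Ugz_HrcKm_valSySW6d4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''
coded = parse_coding_response(raw)
```

Indexing by ID mirrors the page's "Look up by comment ID" feature: once parsed, any coded comment can be fetched in constant time by its `ytc_…` identifier.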