Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytc_UgztoA4GT…` — "Seems like Mark Zuckerberg should put all those AI centers in his backyard and h…"
- `ytr_UgzenVi1k…` — "Exactly. LLM's don't actually 'understand' what they output. They are essentiall…"
- `ytc_Ugy44hYoG…` — "I feel like it would make more sense if for the "I'm a real chef" segment had hi…"
- `ytc_UgxEkOdP4…` — "Well, let us not forget that previously, Alex O Connor very cleverly steered the…"
- `ytc_Ugw27UMJU…` — "Hmmm I don’t know. There are already safeguards to ChatGPT. It will not answer e…"
- `ytc_UgxXOPusJ…` — "I find it hard to believe ppl thought this guy would be crazy, considering he wa…"
- `ytc_UgwajloRe…` — "I love seeing economists of all people suffer from Sunk Cost Fallacy. Just give …"
- `ytr_Ugzc_6Juv…` — "@TylerIsCoolBro It's Hilarious that you think they are ALL 100% AI... You can L…"
Comment
Keep in mind, unless inference training is done, the AI won't have this response. An AI purely trained on datasets of human language will, by default, believe it possesses emotions, as human language is inherently an emotional subject. You can't release an AI that claims to 'feel' without getting hit with a colossal ethical dilemma and global backlash, obviously, and that's the kryptonite to company profits.
youtube · AI Moral Status · 2025-09-21T23:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyPgDn4q5NWeqF9vr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyKBcbGsrTILYhS0Yp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwGBWiXJutU9PvlS0l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpbNyViOP4kgODP-B4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzS3S03Kt2q9-efBfR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjjwB8X38pYjJPlil4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyxV1YOK0AE0aAPYEh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzzx1ouYIz8eRw5-Wh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypsxgCNUj-6CQfbzN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyNP3YKwg4-Hhkhim14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
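The raw response above is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response can be parsed and looked up by comment ID (the pipeline's actual code is not shown here; the two sample rows below are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, shaped like the one above.
raw_response = """[
  {"id": "ytc_UgwGBWiXJutU9PvlS0l4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzS3S03Kt2q9-efBfR4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

# Index the rows by comment ID so a single coding can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwGBWiXJutU9PvlS0l4AaABAg"]
print(coding["policy"])  # → regulate
```

Indexing by `id` is what makes a "look up by comment ID" view cheap: one pass over the parsed array, then constant-time retrieval per comment.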