Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It's kind of disappointing that these discussions are always about how superintelligence could actually be really dumb
And i get that's a real possibility, but then it's not really 'superintelligence' as you're using it then, is it?
Like yeah, sure something that works like an LLM could optimize for similar dumb things we do but in a more catastrophic scale and context, but that's a very different thing
Source: youtube · Video: AI Moral Status · Posted: 2026-02-03T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzXWGZdUvm8lCMn11B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgybMC-zPZz32OH-xwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGqpuG2_PG2xriwht4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwYuO0-6xx9-Vl3p-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz749USJTIAgFIjdTN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGOeRg_baggtUgbLB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw0ssw-Qj68v1QksT54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmY10QDM9hOoGILId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzyJQ9Tj1cO76_s-G54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-9kEAKOukhQa68Qx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
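The raw response above is a plain JSON array, one object per coded comment. Before merging such a batch into the coded dataset, it is worth validating that every record carries an `id` and that each dimension holds a known code. Below is a minimal sketch; the allowed values are inferred only from the sample output visible above (the full codebook may contain more), and the function name `validate_records` is hypothetical:

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# Assumption: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "regulate", "liability", "none"},
    "emotion": {"indifference", "mixed", "fear", "outrage"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every coding dimension
    holds a value from the ALLOWED sets; anything else is dropped.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record, one with an unknown "responsibility" code.
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"unclear",'
    '"policy":"unclear","emotion":"fear"}]'
)
kept = validate_records(raw)  # keeps only the first record
```

Dropping malformed records rather than raising keeps a batch run going when the model occasionally emits an off-codebook value; a stricter pipeline could log the rejects for recoding instead.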