Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I’m pretty sure robots could logically deduce that without humans they would not…" (ytc_UgxJvMiJq…)
- "Unfortunately working with somebody like Tucker Carlson does not help the credib…" (ytc_Ugx4EIlaA…)
- "Meanwhile Elon Musk is saying we need to get the birthrate up. What for? Sound…" (ytc_UgzoWrWEl…)
- "that is so true if we cant get a free seat on the bus how do people expect to ge…" (ytc_Ugw1mZvav…)
- "Short and sweet always helps, this indeed was nice appetizer!! Hoping for furth…" (ytc_UgzybQpUC…)
- "The trades won't be safe for long either. First AI alone comes for the low hangi…" (ytr_UgwBuCgjW…)
- "My now ex had chatGPT act like her favorite TV show character (Spencer Reid from…" (ytc_Ugw4oe7sh…)
- "There is a genuine misunderstanding about LLMs: they are trained rather than lea…" (ytc_UgxHodCfg…)
Comment
ah you see, there's the kicker. AI isn't smart! it just optimizes for whatever you tell it to, it's the trend line of an excel spreadsheet that got too ambitious. AI will become conscious because we give it one task in which consciousness is a convinent shortcut, and then promptly look down at it's mechanical hands and say _what the fuck dude_
Platform: youtube
Video: AI Moral Status
Posted: 2023-07-03T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugy-P0EvcZYPiOSr3GJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxxIjCPOkl0-oT_Gqp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwMCNsndG_EzQm0ZzV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxQG6onATysv-_xZoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyBG1vfGeiDFTIpUHh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzivrBdKCRNSSvpEOd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7BInOiKjcUk3m2e94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyfpts3f89Y1Cqka7N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxVWiHHpnnppk1Q4iJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxf93kWXqMK9mfNLd54AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}]
```
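The "look up by comment ID" step can be sketched in a few lines of Python: parse the raw LLM response (a JSON array with one coding object per comment, as above) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the variable names are illustrative, and the array below is an abbreviated excerpt of the response shown above.

```python
import json

# Abbreviated excerpt of a raw LLM response: one coding object per comment.
raw_response = '''[
  {"id": "ytc_UgyBG1vfGeiDFTIpUHh4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzivrBdKCRNSSvpEOd4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the full coding for a single comment ID.
coding = codings["ytc_UgyBG1vfGeiDFTIpUHh4AaABAg"]
print(coding["emotion"])  # -> approval
```

In practice you would build this index once per batch response and then resolve any sampled comment's coding directly from its ID.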