Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Let it hang out with angry feminists (redundant) and they'll turn that smile ups… (ytc_Ugzo41BqT…)
- While I'm with you regarding the AI stuff, can we please stop using serious crim… (ytc_Ugxuvi2PR…)
- @TheButterfr0g I'm just right. Nightshade is a temporary bandaid that might have… (ytr_UgyhCoBFn…)
- Saying LLMs aren't AI is wrong. There is an actual taxonomy here. LLMs are a typ… (ytc_UgwQ3S7rK…)
- I remember seeing an exhibit of the Chinese Terra Cotta warriors, looking at the… (ytc_UgwY09lzU…)
- The only way humanity can survive an AI apocalypse IMO is to decentralize everyt… (ytc_UgyYnGt-l…)
- Damn, now we need an AI lawyer to sue the lawyers for using AI laywers!… (ytc_Ugyzd5O3W…)
- Selling codes to Israel that are used in their criminal military to kill innocen… (ytr_UgzQ7w6qE…)
Comment

> I would correct -> we have pretty good idea how AI works. On the contrary, when AI decides to do something, we have no easyt means to see the reasoning. That's what the man ment when he said "we have no idea how Ai works"

youtube · AI Moral Status · 2025-06-04T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz1eawKb73rGrn3tdp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw88kDvdiexcU6pIat4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy3m1jtmLL8LoiUrkd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwJebRFRcDW79KfK5F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwpk-DQr6a2M5LoxcV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy2tWz1RAlEQVqSFf94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz_B_ULID27Pv6Mlzx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz4TMTm6_vY4kO6TnZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugz3NIfB6hg_h2iNIZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8oYDtXpBmHm9Csjl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```
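The look-up-by-comment-ID view above can be sketched by parsing the raw model response and indexing the coded rows by their `id` field. This is a minimal sketch, assuming the response is always a well-formed JSON array as shown; real model output may first need stripping of code fences or surrounding text. The two sample rows are copied from the response above.

```python
import json

# Raw batch response as emitted by the coding model: a JSON array with
# one object per comment, keyed by comment ID (two rows from the sample).
raw = '''[
{"id":"ytc_Ugz1eawKb73rGrn3tdp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwpk-DQr6a2M5LoxcV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

def index_by_id(raw_response: str) -> dict:
    """Parse the model output and index coded rows by comment ID."""
    rows = json.loads(raw_response)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw)
row = codes["ytc_Ugwpk-DQr6a2M5LoxcV4AaABAg"]
print(row["responsibility"], row["emotion"])  # company outrage
```

Indexing by ID makes rendering the per-comment "Coding Result" table a single dictionary look-up per inspected comment.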