Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
It's a tricky subject and a lot of people seem to think this either will be the …
ytc_Ugz60GV2F…
guy looks like a Blade Runner mad scientist . Cool, he is builidng a robot army …
ytc_Ugx_PDd6h…
My mom has become an A.I artist, despite her having actual passion for tradition…
ytc_UgzI-ls1q…
didn't this happen in the 60s with machines, people were like "with robots doing…
rdc_cz32dni
Back here in 2026.. and lol GUESS WHAT AI BASICALLY HAVE NOW?? A conscience. Her…
ytc_Ugy5fC5dG…
@LikriArtoh yeah, the more ai slop there is, the more likely it is to infect th…
ytr_UgwzZGYb_…
I recently started doing watercolors, purely because I always wanted to learn it…
ytc_UgyLYvy-J…
Is wrong with all you people building this??
Tell me it's not going to happen be…
ytc_UgxFkx3ux…
Comment
The only way to protect humanity is the God dilemma. The more one tries to protect a being, the weaker and defenseless it becomes. So, to protect something and make it strong, one need to leave it alone to fend it for itself and intervene very little. An AI projected to protect humanity, in the limit, tends to being still.
youtube
AI Governance
2025-08-26T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
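A coded record like the one in the table above can be represented as a small typed structure. The following is a minimal Python sketch, not the tool's actual data model: the `CodedComment` class name and the placeholder comment ID are illustrative (the table omits the ID), and the example value lists in the comments are only the categories observed on this page.

```python
from dataclasses import dataclass

@dataclass
class CodedComment:
    """One coded comment: the four coding dimensions plus metadata."""
    comment_id: str
    responsibility: str   # observed: developer, user, ai_itself, distributed, none
    reasoning: str        # observed: consequentialist, deontological, virtue
    policy: str           # observed: regulate, ban, liability, industry_self, none
    emotion: str          # observed: fear, resignation, approval, outrage
    coded_at: str         # ISO 8601 timestamp from the coding run

# The row shown above, with a placeholder ID:
row = CodedComment(
    comment_id="ytc_placeholder",
    responsibility="ai_itself",
    reasoning="deontological",
    policy="none",
    emotion="resignation",
    coded_at="2026-04-27T06:24:59.937377",
)
print(row.emotion)  # resignation
```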
Raw LLM Response
[
{"id":"ytc_Ugy29htWaxqJDB78gQt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwGYLH5aYrwyIkskcF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgweOTT0F05j-9FAtnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxNGyO4loNUPQeZflt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwhc49xB8f29Y0bMoV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRBOdvVpDVfOrCjSh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZQ5z0fSTwahdr1sl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy8i3mSArc_JP5FQn54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyyyuyTzLAkpe6heD14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGT3_jPQeL0SR9SV14AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
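A batch response like the one above has to be parsed and checked before use, since a model can emit malformed JSON or off-codebook labels. Below is a minimal validation sketch under one stated assumption: the allowed value sets are only those observed in the samples on this page, so the real codebook may contain additional categories.

```python
import json

# Allowed values per dimension, inferred from the samples on this page.
# Assumption: the real codebook may include categories not observed here.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "resignation", "approval", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
print(len(validate_response(raw)))  # 1
```

Records that fail validation are dropped here rather than repaired; in practice one might instead queue them for a retry prompt or manual coding.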