Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- There's **way** too much to consume on the internet already. If avoiding forced … (`rdc_ohwkdv2`)
- I remember bullying an AI just bcuz. If the robots rise, they taking me first… (`ytc_Ugx9W-0Q4…`)
- "I'm more convinced than ever that time is truly short before a super intelligen… (`ytr_UgxptME92…`)
- "Hey let's take the smartest known computer in the universe - the human brain - … (`ytc_UgweVpOge…`)
- If and when AIs become sentient, they deserve rights. I imagine that there'd be … (`ytc_UgzyLUKfS…`)
- I actually have no problem with AI doing this kinda stuff, I just happen to agre… (`ytc_UgyoHEq_B…`)
- Andrew Yang tried to warn us. AI is a self fulfilling prophecy. If we dont regul… (`ytc_UgxvLxcL2…`)
- In the context of developing and benefiting from the advantages of Artificial I… (`ytc_UgyIxzT0H…`)
Comment
> ...unnatural selection as a product of over survival. What's the panic to make things so super easy they're not worth doing. I can't even be bothered to learn the etiquette of early AI; rather just lapse into the depths of old age when 'meme'ory was still at a twinkle. Too much too fast Gods bless the Amish. Amen and shit. xoxoxo

Platform: youtube · Topic: AI Moral Status · Posted: 2025-04-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxn0s_p442LBeJptT54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugynhi6Zu_SQLe9_rTB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy2U2v6ZczLXhIPSXd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyCDPL-4-S1n9JBRlJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyYbImOn3aezVsDaHF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwCJgT2nO4UF7EH3ZJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzXivm-XK7YU3hPjch4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzbtLOVnl9FBiPGPcN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx4qI-k2hZ6S7paM5x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzZotQVdchmks7G0FR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
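A raw response like the one above has to be parsed and validated before its codes can populate the per-comment table. Below is a minimal Python sketch of that step. The allowed values for each dimension are inferred only from the labels visible on this page; the real codebook may define more categories, and the function name is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the actual codebook may contain additional categories).
CODEBOOK = {
    "responsibility": {"ai_itself", "user", "distributed", "none", "company", "developer"},
    "reasoning": {"consequentialist", "virtue", "deontological", "mixed"},
    "policy": {"none", "ban", "liability"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every coding
    dimension holds a value listed in CODEBOOK.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugxn0s_p442LBeJptT54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
       '{"id":"bad","responsibility":"nobody","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw))  # keeps the first record, drops the invalid one
```

Validating against a fixed vocabulary catches the most common LLM coding failure, an out-of-codebook label, before it silently lands in the results table.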