Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Aurora driver calling home base mainframe. A combination of sensor inputs indica…" (ytc_Ugwaw8V3z…)
- "facts am with elon on this one predicited this be4 ai was born he knows. dont f…" (ytc_UgxnzGDhD…)
- "AI should be programmed to have zero self-preservation and to not be able to rep…" (ytc_UgxIwPDKu…)
- "Bubblegum dystopia, we’ll have ai and holograms but most of us will be soul crus…" (ytc_UgyfNpjRN…)
- "this would really suit a depopulation program...get kids to question their gende…" (ytc_UgxgQZPBm…)
- "Caveat: I am very new to serious socialist thought, so this opinion is almost ce…" (ytc_UgyVXh3NA…)
- "Do u realize u all sound like our grandparents…oh my God, the phone with no cord…" (ytc_Ugys-l_18…)
- "I hate how Ai is getting shat on when it really should be the people acting like…" (ytc_UgwXXS2AM…)
Comment

> "The AI just did what it wanted"... NOPE you gave it a goal and no parameters - there is nothing in there but we put in - so we will need REGULATION instead of our regulators lazily using AI to regulate. You can't just red scare - the Chinese will use the same lame excuse. If you can't think about regulation then you better come up with PROTECTIVE AI for all. Garbage in Garbage out.

Source: youtube | Video: AI Moral Status | Posted: 2025-06-05T06:3… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEKsAs70fs6agKxFt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0D_OueL_OPhqe1nd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgymtewyWS_XZazXT1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzkVMG8sh6SHqBzdzF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzHbNDbHiMiMGxPrZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzVpYHY3Na906H_sSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwEcaBzvzJPJOGPpPV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydraiAlDU8byE70eR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw2Igo8uAT5PJFoKk94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwzazSrGptbj3daRYh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"}
]
```
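A response like the one above can be parsed into a lookup table keyed by comment ID, which is how the "look up by comment ID" view can resolve a record. Below is a minimal sketch in Python; the allowed value sets in `SCHEMA` are inferred from the samples shown here (the real codebook may define more categories), and `parse_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per coding dimension. NOTE: these sets are inferred from
# the sample output above; the actual codebook may include more categories.
SCHEMA = {
    "responsibility": {"developer", "company", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Records with a value outside the schema are dropped, so a malformed
    or hallucinated label cannot silently enter the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(parse_batch(raw)["ytc_example"]["policy"])  # regulate
```

Validating against a closed value set at parse time is the main design point: an LLM coder occasionally emits labels outside the codebook, and rejecting those records keeps the dimension columns clean for downstream tabulation.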