Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID. Random samples are listed below.
- "One of the broadest, but at the same time down to earth, courageous, constructiv…" (ytc_UgzBLQsmX…)
- "I've watched it again and again for a while now, and I still can't figure it out…" (ytc_UgyMrPskA…)
- "why dont people stop getting AI to make them things, and actually get AI to TEAC…" (ytc_UgwBbufpg…)
- "It's called Prompt Engineering, my question, is what was typed into Chat GPT bef…" (ytc_UgzmA1wC4…)
- "Think about if Ai is doing the physical aspects of physical reality, as in creat…" (ytc_UgxBW7qM4…)
- "Internet collapse 2000, Housing Collapse 2007, AI collapse 2025 plus BBB and Tar…" (ytc_UgwGzq7pN…)
- "I really wish AI (really just "we taught computers to make a best guess and it t…" (ytc_UgzjMBVwY…)
- "Yesterday, i have seen an idiot try to get a commision for AI "art"... ...and he…" (ytc_Ugw1qNtHt…)
Comment
I wish people would explain the "black box" concept around AI more clearly, instead of just saying, "we don't know how it works" and leaving it at that. Also, I wish that "alignment" weren't about aligning AI to a certain country's values, but more about working across countries to design a system that reinforces basic human interests. I WISH. Sigh.
youtube · AI Moral Status · 2025-06-05T06:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyEKsAs70fs6agKxFt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx0D_OueL_OPhqe1nd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgymtewyWS_XZazXT1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzkVMG8sh6SHqBzdzF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzHbNDbHiMiMGxPrZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVpYHY3Na906H_sSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwEcaBzvzJPJOGPpPV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydraiAlDU8byE70eR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw2Igo8uAT5PJFoKk94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwzazSrGptbj3daRYh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"}
]
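The raw response above is a JSON array in which each object carries a comment `id` plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed and sanity-checked before use — note that the allowed value sets below are inferred only from the outputs visible on this page, and the real codebook may contain additional categories:

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample outputs shown above; the actual codebook may define more.
ALLOWED = {
    "responsibility": {"government", "developer", "company", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose codes are
    all within the allowed vocabulary for their dimension."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Usage with a hypothetical single-row response:
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"mixed","policy":"regulate","emotion":"outrage"}]')
print(len(validate_codings(raw)))  # prints 1
```

Filtering rather than raising keeps a long batch run alive when the model occasionally emits an out-of-vocabulary label; the dropped IDs can then be re-queued for recoding.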