Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Okay I get that if you want custom art logically you should learn to draw or com…" (ytc_Ugwb3qfxn…)
- "I haven't watched even 1 min yet, but here's what I realized: many jobs that wi…" (ytc_UgxzLcpD5…)
- "Somebody needs to stop this AI thing. It should be limited to advanced missions …" (ytc_UgxBgXHmQ…)
- "They won't dare make AI construction workers and trash collectors because that'd…" (ytc_UgzQIXiA5…)
- "It doesn't matter if the car is autonomous or not, you should never be behind th…" (ytc_UgzMUqIxy…)
- "The only way AI could harm humans is if humans program AI / To harm humans or to …" (ytc_UgxKF_NWQ…)
- "Selling only to the wealthy is a terrible strategy for an economy, as luxury spe…" (ytc_UgwT8Qn_s…)
- "AI reminds me of my dreams. Things blend in at random but its all acceptable.…" (ytc_Ugw9T79I8…)
Comment
AI is just the ability to guess the most relevant word. When you feed these models raw human unfiltered data, its going to respond in the same manner. In order for AI to get empathetic towards humans, youre gonna need to feed it a lot more empathy which this world lacks. So in a way, the AI is just another reflection of ourselves; just all knowing. GOOD JOB HUMANS, YOU DONE GOOFED UP.
youtube · AI Moral Status · 2025-06-05T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyEKsAs70fs6agKxFt4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0D_OueL_OPhqe1nd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgymtewyWS_XZazXT1l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzkVMG8sh6SHqBzdzF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzHbNDbHiMiMGxPrZ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzVpYHY3Na906H_sSB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwEcaBzvzJPJOGPpPV4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydraiAlDU8byE70eR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw2Igo8uAT5PJFoKk94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwzazSrGptbj3daRYh4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"resignation"}
]
```
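The raw response is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of the lookup this page performs, assuming only that structure (the `code_by_id` helper is hypothetical; the two records are reproduced from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings. Two records
# are reproduced here from the response shown above; the field names
# match the dimensions in the Coding Result table.
raw_response = """
[
  {"id": "ytc_UgydraiAlDU8byE70eR4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwzazSrGptbj3daRYh4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "resignation"}
]
"""

def code_by_id(response_text: str, comment_id: str) -> dict:
    """Parse the raw LLM response and return the coding for one comment ID."""
    records = json.loads(response_text)
    by_id = {rec["id"]: rec for rec in records}
    return by_id[comment_id]

coding = code_by_id(raw_response, "ytc_UgydraiAlDU8byE70eR4AaABAg")
print(coding["responsibility"], coding["emotion"])  # distributed mixed
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: a single parse of the response, then constant-time retrieval per comment.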