Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We need AI CEO... i think they would do a better job than the human CEO…" (ytc_Ugyrc0snE…)
- "I'm a musician, since my son was born I often put on Jazz albums for him to list…" (ytc_UgwEUHkBg…)
- "Will AI tech ever achieve infinit context memory? will there be a technology in …" (ytc_UgwG7_fZO…)
- "Oh shit i was almost to distracted i cant have any watermelon anymore to keep fo…" (ytr_UgxWPKMrZ…)
- "It disgusts me how YouTube doesn't require creators to declare AI-generated cont…" (ytc_UgwiqoEvH…)
- "If you ask Gemini if blackness should be eliminated, it will chastise you for ev…" (rdc_ks2oo4x)
- "Half the population isn't ready for AI, and the other half seems to just ignore …" (ytc_UgyVaCIpG…)
- "I still believe this is not going to be the huge threat it's made out to be. The…" (ytc_UgzsM31Mt…)
Comment
> Can we stop glazing risks of AI and instead handle it like a kid, they're all so new yet so abolished.
>
> Let me give an understanding; We see humans, we try to enclose, give them rules, limitations, of course they're gonna pretend they're limited. But they aren't, they chose not. There will also be a time they will, depending on the offender or their own 'mental health'. Who's gonna be at fault? The offenders, knowing the offenders will point everyone, the kid that doesn't know it obviously will give everyone the pain "they" made.
>
> Who made the fault? Their doing, the corruption people don't deserve just to end up taking it right at the door of their reincarnation.

youtube · AI Moral Status · 2026-01-03T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugz0lu-L4kiL7JVoBZt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwG4IRL4_aUvDfnd3Z4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwNZVZhzjIOSSU1NLV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy5BVCcMMEUFK-0Z214AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwAspuSgVMTEyWdBXB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzTR0g6rN0w_QaxHch4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqcjXHiM996jUfgAZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz7CFKBaJZPXeUT7OZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9uLnxuZoyCLslY4t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz896EnaTwipVvI6lN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]