Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Who’s to say that the human body and its neurons aren’t themselves a biological …" (ytc_UgyzEDdHI…)
- "I worked at a startup a few years back designing an automated camera system to s…" (rdc_irfgg0l)
- "She is so easy to draw. Truly sad why he needs AI to do it.…" (ytr_UgwulJnUc…)
- "Nope, we've been in recession for a while now, AI is just glossing over a falter…" (ytc_UgxYZWu_A…)
- "That line of thinking could apply to humans, no? Pain, and other feelings and em…" (ytr_UgxMpV4Zv…)
- "Why are you guys hiring juniors, should be able to just have your mid and senior…" (rdc_oi0nhb7)
- "@pauldobson9379 you seem to be ignoring that failure mode analysis is crucial ev…" (ytr_UgxDw_hf_…)
- "its not that even mad at the use at the use of image ai generators or ai as a …" (ytc_Ugz9nzd_-…)
Comment
Blah blah blah!! it’s the same things everywhere! If you truly care so much about humanity, then introduce regulations and enforce them. Bind AI applications or development (whatever) to laws at the state level. Establish ethical laws. Create content oversight institutions. If it’s really this uncontrolled, if it’s really this harmful, if it’s really ‘a toy in the hands of a handful of rich snobs whose intentions are unclear’ which can already undress women and girls in public spaces then do that in a single session as well. Talk about this. Take action for this. But I guess putting your hands on your head and pretending to say ‘oh no, what have we done!’ is more dramatic?!
youtube · 2026-02-05T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzCfp4LQk116kzkupB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJOLOjCeJQrjuk6pp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy7e05jMgxmgEE75EN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyhgkvmCctLt3x65L54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyeApLwLpO-IG3lGGF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxby2hHo3D5S7WUocx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwwCpmuUI5u3-Ig0Dt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwagQOz27Pi8XsOTuB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxzxLhZmu2PibnSj794AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw7gbc4vG4RAaC27k94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
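The raw response is a JSON array of per-comment codes, keyed by comment ID, with one value per coding dimension. A minimal sketch of how such a response might be parsed and validated before lookup; the allowed value sets below are inferred from the codes visible on this page, not a confirmed codebook, and `parse_codes` is a hypothetical helper name:

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page
# (an assumption, not a confirmed codebook).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"indifference", "fear", "resignation", "outrage", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting bad values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: invalid {dim} value {rec.get(dim)!r}")
        # Keep only the known dimensions, indexed by comment ID for lookup.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response in the same shape as the array above.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Validating against a fixed value set at parse time surfaces any out-of-schema code the model emits, rather than letting it silently enter the coded dataset.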