Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI is learning from us... our apocalyptic movies.. our religious scriptures.. ou…" (ytc_UgzUVOMmM…)
- "@splaturials9156 Can you elaborate? Just to be clear, when I say "would it be ok…" (ytr_Ugy--cQ1x…)
- "Great question. I said that the only reason to save or protect them is because i…" (rdc_jdavte3)
- "Haha. First minute of the video: Doctors desperately trying to explain that AI…" (ytc_UgxV9np8G…)
- "I thinkthere are multiple views here, AI replacing certain occupations leaves ro…" (ytc_Ugx2wUzUT…)
- "Let's make it simple, AI output can't be art, is just content, and should stay l…" (ytc_UgyU9VXQc…)
- "So the issue is an entity uses other art as sources as inspiration for new art? …" (ytc_Ugxcb4-AV…)
- "I think that if this autopilot thing is going to be a continuous thing, local an…" (ytc_UgwfJziH3…)
Comment
I have to ask a legitimate question and not to be seeming as though I am making fun of anything, but is there a piece of plexi glass between the backseat and the front seat which does not allow the man to get up into the driver seat and start controlling the car? Just asking because the day is get in a driverless car will be never and doubly so if I’m caged into the rear area.
youtube
AI Harm Incident
2025-02-05T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
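Each coded dimension above takes a value from a small categorical vocabulary. A minimal validation sketch follows; the value sets are assumptions inferred only from the codings visible on this page, and the real codebook may contain additional categories:

```python
# Allowed values per dimension, inferred from the samples on this page.
# ASSUMPTION: the actual codebook may define more categories than these.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"liability", "none"},
    "emotion": {"fear", "outrage", "indifference"},
}

def validate_coding(coding: dict) -> list:
    """Return a list of problems; an empty list means the coding looks valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in {sorted(allowed)}")
    return problems

print(validate_coding({
    "responsibility": "company",
    "reasoning": "consequentialist",
    "policy": "liability",
    "emotion": "fear",
}))  # []
```

A coding with a missing or unknown value (e.g. no "emotion" field) would come back with one problem string per bad dimension, which makes it easy to flag malformed model output before it enters the dataset.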
Raw LLM Response
[{"id":"ytc_UgyvwGRi2qMmVYNROkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWv1mo30nnOMxaBTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzm3dW1aevrzjYv6qV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzDynpJJ51wjBZpnS14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcFtd0fSMcDDpDRzN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfIeeZ4ab_EmJAfFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfAvm7tVclEyVAx6p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqyeC33_idYPktNZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyzJyncP2J_mvUPYVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcPrvvvh3Y8v-GXCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
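The raw response above is a JSON array with one object per comment, so looking up a coding by comment ID reduces to parsing it and building an index. A minimal sketch, assuming the model output is well-formed JSON (the two records below are copied from the response above; a real response may need error handling for malformed output):

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_UgyvwGRi2qMmVYNROkt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzm3dW1aevrzjYv6qV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

# Parse the model output and index the codings by comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for a specific comment.
coding = codings["ytc_Ugzm3dW1aevrzjYv6qV4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company liability
```

This is the same lookup the "Look up by comment ID" box performs: the ID prefix (ytc_, ytr_, rdc_) identifies the source platform, and the rest keys directly into the parsed response.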