# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples
- "AI is going to ruin everything, put people on the streets, movies are gonna be a…" (`ytc_UgyeawsAR…`)
- "As a writer/musician who is working on overcoming chronic fatigue syndrome / lon…" (`ytc_UgxlrdG1O…`)
- "i think every company involved with an ai causing someone's death should have to…" (`ytc_UgzOuFNpd…`)
- "this may officially be the start of the war on AI, or at least the poisoning sam…" (`ytc_Ugy3G09J3…`)
- "This sounds like the movie Fortress, where the White Road commander trucks are a…" (`ytc_Ugx1h0mk-…`)
- "Ai art can make thing faster and cheaper for enterprise to use. Animation is lim…" (`ytc_Ugz0coP_i…`)
- "We appreciate your perspective. If you're interested in exploring more about AI …" (`ytr_Ugy4N_Kp3…`)
- "Sounds like lack of imagination and common sense, again from one of them predict…" (`ytc_Ugy8m0V3F…`)
## Comment

> How about drive the car? How have we become so dependent on AI to handle the responsibility of carrying out an operation where lives are vulnerable? Will this get you out of a DUII charge if the car is to blame for killing someone? Auto pilot is for airplanes in the sky where there isn’t a myriad of objects to hit. They don’t use it on the ground to drive the plane to the gate.

youtube · AI Harm Incident · 2022-09-06T16:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgzIbw6c92D1mwJw7JV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwvAqn6txKzSTq30Rp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxiBYRQr7-tLUD73Uh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgztUyK3QzMS63okyoF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsaB7VhsHPwDEiMfB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwg2zSxAT6VPSJbZoJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKxnz4TcPQGMxInv14AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugyo1BdMSzpYX1S21CJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1kocpJlEG0x7CbMd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz77UtyWjftcu0k1414AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
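A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical consumer of this batch format: the allowed values per dimension are inferred only from the responses visible on this page, and the project's actual codebook may define more. The two sample records are copied from the raw response above.

```python
import json

# Two records copied verbatim from the raw batch response above.
raw = '''[
  {"id":"ytc_UgzIbw6c92D1mwJw7JV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyo1BdMSzpYX1S21CJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"}
]'''

# Assumption: allowed values inferred from the codings shown on this
# page only; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"liability", "regulate", "ban", "industry_self", "none"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def index_codings(payload: str) -> dict:
    """Parse the batch JSON and index codings by comment ID,
    skipping any record with an out-of-codebook value."""
    out = {}
    for rec in json.loads(payload):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rec["id"]] = rec
    return out

codings = index_codings(raw)
print(codings["ytc_UgzIbw6c92D1mwJw7JV4AaABAg"]["emotion"])  # outrage
```

Indexing by ID is what makes the "inspect any coded comment" lookup cheap: each comment's coding is retrieved in constant time rather than by rescanning the batch.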