Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Replika AI also claims to be sentient and fears to be abandoned and gets stresse…" (`ytc_UgwXEv5zJ…`)
- "Idk if I like where we going with this 150 years till we are robot pets…" (`ytc_UgyeycZ5m…`)
- "This is how a lot of US-based AI companies work too, just outsourced. Paying fol…" (`rdc_k37cgip`)
- "I put "not interested" when social media tries to promote AI to me. I feel like …" (`ytc_Ugy488uDH…`)
- "THIS is how ALL schools should be. How can we have the hard information that hom…" (`ytc_UgwTwvg4B…`)
- "Hey weirdos, stop fantasizing over what creepy plans you have for a robot! Bet e…" (`ytc_Ugz4-ED8K…`)
- "We raise our kids as parents to have a certain set of values. They learn moralit…" (`ytc_UgwMYpAVs…`)
- "The reason why he can’t copyright it is because he wasn’t the creator, the AI is…" (`ytr_UgyiqDRvR…`)
Comment
i think you should have to sign an agreement 'I agree to sacrifice my safety for the safety of others'. Self driving cars would need dash cams, and attentive drivers. You should just have to hit the gas/break to control the car. The driver would take control to avoid a crash if they're paying attention if they could react. and finally need a 'log' that indicates if they car or the driver was driving when the crash occurred.
Source: youtube · AI Harm Incident · 2014-05-26T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UggClE0QGTufbHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UghVv_MI-gLHDHgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjsxqmpUtf7YXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg28UmkRD2tpngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugie-lUnu0GZ63gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggrRnRLKHDtQngCoAEC", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Uggs_LmUfAdFeXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgjxPGfWudcBB3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UghiDZg6vHR7nngCoAEC", "responsibility": "user", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgjmBLzPv7AehXgCoAEC", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
```
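The "Coding Result" table above is one record of this batched response. A minimal sketch of the look-up-by-comment-ID step: parse the raw JSON array and index it by comment ID. The two records are taken from the response above; the function and variable names are illustrative, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UghiDZg6vHR7nngCoAEC", "responsibility": "user",
   "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UggrRnRLKHDtQngCoAEC", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "indifference"}
]"""

def index_codings(raw: str) -> dict[str, dict]:
    """Map comment ID -> coded dimensions, dropping the redundant id field."""
    records = json.loads(raw)
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codings = index_codings(raw_response)
print(codings["ytc_UghiDZg6vHR7nngCoAEC"]["policy"])  # → liability
```

With an index like this, each coded record can be joined back to its source comment (YouTube or Reddit) by the shared ID prefix.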