# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Comment

> been taking ubers for almost a decade and taking taxis my whole life, experienced ONE uber ride ever where the driver was driving recklessly/dangerously. but yeah, let's get rid of all the humans driving us to make a living so that we can instead endanger riders, pedestrians, and other drivers bc the car is being controlled by a dumbass robot. we haven't even made robot VACUUMS with perfect navigation yet, but we're trusting full-sized sedans to drive themselves safely in multiple cities

Source: youtube · Topic: AI Harm Incident · Posted: 2025-04-11T20:3… · ♥ 1
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_UgyEFpkoaKoOPC0h3Xh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx0IN8Eq2LkSggB8pJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxXj5WKej3o_iKbJ0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyB4S4pgozgtdcX7Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxIO-CxaK9Q66hjzgJ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwQnSHNt6aV__Ss7qt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwydxO6EU9UwcDhuWN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyuw2sVz1W77evI0ol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw1_VQtEMZqpNFq_Gh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxbS57Oqzz3b9H6JHB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
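The raw response is a JSON array with one object per comment, carrying the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by `id`. A minimal sketch of parsing such a batch response and looking up a single comment's coding by ID; the comment IDs in the snippet are hypothetical placeholders, not real IDs from the dataset:

```python
import json

# Raw model output: a JSON array of per-comment codes. The field names match
# the response shown above; the IDs here are hypothetical examples.
raw_response = '''
[
  {"id": "ytc_EXAMPLE1", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_EXAMPLE2", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
'''

# Index the batch by comment ID so any coded comment can be inspected directly.
codes = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coding for one comment.
record = codes["ytc_EXAMPLE1"]
print(record["policy"], record["emotion"])  # ban outrage
```

Indexing by `id` is what lets a lookup-by-comment-ID view resolve a single comment's dimensions without rescanning the whole batch.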