Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (click to inspect):

- `ytr_Ugy3ivKUg…`: "They use philosophy to fool us. If AI is conscious so are electric toys. Give th…"
- `ytc_UgwZPx3LP…`: "I believe this will strive very strongly, but not for long. It will get to a poi…"
- `ytc_UgwK3RxPw…`: "You know what's scary.... Robots... are basically humans... who don't get tired.…"
- `ytc_UgygGw0wh…`: "The techbros still haven't managed to come up with AI robots that can pick crops…"
- `ytc_UgwA49mPI…`: "Americans would never learn from mistakes. Please pay attention to what is going…"
- `ytc_UgzUL8aj7…`: "Bro... AI will do none of these things. The only reason it resorted to doing mal…"
- `ytc_UgyEG3qxU…`: "cool video but why was he trying to convince the ai that a hotdog (not a sandwic…"
- `ytc_UgzZyHPom…`: "AI is the future, but not as is. / Locally, not on a subscription / AI assisted fea…"
Comment

> Why didn't the headlights pick her up at 75 yards instead of at 20? Are the headlights aimed abnormally low to avoid interfering with the autonomous technology? Maybe this video doesn't show the pedestrian the way a normal car would. With normal headlights this looks like an easy miss. Hit the brakes, blow the horn and shake your head as you drive away.

Source: youtube · Incident: AI Harm Incident · Posted: 2018-03-22T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwwVjTh9NFjJvuMN4d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz4viWoSK17RWk-2md4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwOIjISMnaEISw_Z9J4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwJUzDagvk24tFIoNN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwcEpBFlhHJa2NGPWR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwHFdrbPL6Leg6C2HB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwvLTvgcr37Uw9vRIV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyV019tyG9R0g7rw7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiyW43qT6sIpx0xW14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgypBU9Wklx3CdNBpsV4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
```
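A raw response like the one above can be checked before its values are accepted into the coding table. The sketch below is a minimal validator: it parses the JSON array and keeps only records whose values fall within the per-dimension vocabularies observed on this page. The allowed-value sets and the function name `validate_codings` are assumptions for illustration; the actual codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown on this
# page (assumption: the real codebook may include more categories).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "mixed", "approval"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Require a dict with a comment ID and in-vocabulary values
        # for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record and one with an out-of-vocabulary value.
raw = '''[
  {"id":"ytc_A","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_B","responsibility":"robots","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]'''
print([r["id"] for r in validate_codings(raw)])  # only ytc_A survives
```

Records that fail the check can then be routed back for re-coding rather than silently entering the results.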