Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific response by comment ID.
Random samples:
- "I would say that global leadership will do nothing on climate change. Advances …" (`rdc_gx6lgzn`)
- "We have lost the ability to connect with people on any sort of emotional level. …" (`ytc_UgydHNdSW…`)
- "This will take away the teacher jobs. Just like all the other jobs Ai is taking.…" (`ytc_Ugxi3UnqM…`)
- "I was using "AI" the other day and I said I was going to add a counter to my pro…" (`ytc_UgyVSgrXw…`)
- "Bro I made it a dog and after that…… never looked at ai the same…" (`ytc_Ugw4CjKxO…`)
- "AI Art is very dependant on handmade work, they take from unconsenting Digital A…" (`ytc_Ugw6Jzzzj…`)
- "CHATGPT answers: Short term effects: Prices rise, Inflation pressure, Consumer …" (`ytc_Ugzfy4Fa4…`)
- "I don't think you understand how this works. like Sam said, "it's a tool not a c…" (`ytr_Ugwios3yo…`)
Comment
> Interesting. Of course we may end up in a case where it is unsafe for any non-self-driving vehicles are on the road, so it would all be software decisions.
>
> But I wanted to comment just in the first few seconds of the video, that if the car can't stop in time, then the software wasn't working properly. It shouldn't be following so close that it couldn't stop in time, I would expect that to be programmed into the rules. Of course it could happen in brief instances, if the truck cut the car off and then immediately thereafter the items fell off the back.
youtube · AI Harm Incident · 2015-12-09T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
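Before a coded row like the one above is stored, each dimension can be checked against its allowed value set. A minimal validation sketch in Python; the value sets here are only those that appear in the sample responses in this view, so the actual codebook may contain additional categories:

```python
# Allowed values per dimension, inferred from the sample responses shown
# in this view (the real codebook may define more categories).
CODEBOOK = {
    "responsibility": {"developer", "user", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "liability", "regulate", "ban", "industry_self"},
    "emotion": {"outrage", "approval", "indifference", "resignation", "mixed"},
}

def validate(coding: dict) -> list:
    """Return a list of error messages; an empty list means the coding is valid."""
    errors = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding from the result table above passes cleanly:
row = {"responsibility": "developer", "reasoning": "deontological",
       "policy": "liability", "emotion": "outrage"}
print(validate(row))  # []
```

Rejecting out-of-vocabulary values at ingest time keeps downstream aggregation honest when the model occasionally returns a label outside the requested set.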
Raw LLM Response
```json
[
  {"id":"ytc_UgjFBYIvdRYVgHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggfFxjEN8s_5ngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugj3QKzIe1Eq-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghHtM6MCJz6TXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggQNW11cKIdvngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghwhUYMzBGyVHgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgiBxpBHRTAhPHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggPxpiSP8H8OngCoAEC","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiVRBq_S_B0h3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugjz5578tI7sb3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
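The model returns one JSON array per batch, so each element has to be matched back to its comment by `id` before its dimensions can be stored or looked up. A hypothetical parsing sketch in Python (function and variable names are illustrative, not taken from the actual pipeline; the response is truncated to two entries):

```python
import json

# Raw batched response in the format shown above (truncated to two entries).
raw_response = '''[
  {"id": "ytc_UgjFBYIvdRYVgHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggfFxjEN8s_5ngCoAEC", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse a batch response and key each coding by its comment id."""
    codings = json.loads(response_text)
    return {entry["id"]: entry for entry in codings}

# Look up a single comment's coding by id, as the inspector does.
by_id = index_by_id(raw_response)
print(by_id["ytc_UgjFBYIvdRYVgHgCoAEC"]["responsibility"])  # ai_itself
```

Keying by `id` also makes it easy to detect when the model drops or duplicates an entry: the number of keys should equal the number of comments sent in the batch.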